sha | text | id | tags | created_at | metadata | last_modified |
---|---|---|---|---|---|---|
04c03e51e903349b0e7ae12b967288cc5d6b68de | | Alimustoofaa/alpaca-indonesia-llama-1K | ["region:us"] | 2024-01-13T17:42:08+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 728489, "num_examples": 1000}], "download_size": 374413, "dataset_size": 728489}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-13T17:52:18+00:00 |
2bdf9b3711c5ffccedcd717e7519ff073c7dadb4 | | P1ot3r/libri-val-en-whisper-small | ["region:us"] | 2024-01-13T17:43:27+00:00 | {"dataset_info": {"features": [{"name": "input_features", "sequence": {"sequence": "float32"}}, {"name": "labels", "sequence": "int64"}], "splits": [{"name": "validation", "num_bytes": 2596418544, "num_examples": 2703}], "download_size": 674059720, "dataset_size": 2596418544}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-13T17:45:48+00:00 |
b87dad1d5e99405375b0238d34a53acb4ef779b3 |
# Dataset Card for Evaluation run of flemmingmiguel/Distilled-HermesChat-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [flemmingmiguel/Distilled-HermesChat-7B](https://huggingface.co/flemmingmiguel/Distilled-HermesChat-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B",
"harness_winogrande_5",
split="train")
```
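If you are unsure which of the 63 configurations to request, a minimal sketch like the one below can list them and pull the aggregated "results" config. It only uses public `datasets` utilities; the repository, config, and split names are taken from this card, and the assumption that the "results" config also exposes a "latest" split mirrors the per-task configs described above.
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B"

# List every configuration stored in this repository (one per evaluated task,
# plus the aggregated "results" config).
configs = get_dataset_config_names(repo)
print(len(configs), "configurations, e.g.:", configs[:5])

# "latest" points to the most recent evaluation run of this model.
results = load_dataset(repo, "results", split="latest")
print(results)
```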
## Latest results
These are the [latest results from run 2024-01-13T17:41:54.536456](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B/blob/main/results_2024-01-13T17-41-54.536456.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6549679088142555,
"acc_stderr": 0.03191312416103038,
"acc_norm": 0.6559474034222305,
"acc_norm_stderr": 0.03256025642473883,
"mc1": 0.37821297429620565,
"mc1_stderr": 0.01697633590754687,
"mc2": 0.5477099988321158,
"mc2_stderr": 0.015436090753363047
},
"harness|arc:challenge|25": {
"acc": 0.6399317406143344,
"acc_stderr": 0.014027516814585186,
"acc_norm": 0.6749146757679181,
"acc_norm_stderr": 0.013688147309729124
},
"harness|hellaswag|10": {
"acc": 0.6649073889663414,
"acc_stderr": 0.0047105814966393374,
"acc_norm": 0.8521210914160526,
"acc_norm_stderr": 0.0035425443194051424
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823078,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823078
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.023661296393964283,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.023661296393964283
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8623853211009175,
"acc_stderr": 0.014770105878649405,
"acc_norm": 0.8623853211009175,
"acc_norm_stderr": 0.014770105878649405
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229092,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229092
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508766,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508766
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8403575989782887,
"acc_stderr": 0.013097934513263005,
"acc_norm": 0.8403575989782887,
"acc_norm_stderr": 0.013097934513263005
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28938547486033517,
"acc_stderr": 0.015166544550490298,
"acc_norm": 0.28938547486033517,
"acc_norm_stderr": 0.015166544550490298
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.02521804037341063,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.02521804037341063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042117,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042117
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49022164276401564,
"acc_stderr": 0.012767793787729336,
"acc_norm": 0.49022164276401564,
"acc_norm_stderr": 0.012767793787729336
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7316176470588235,
"acc_stderr": 0.026917481224377197,
"acc_norm": 0.7316176470588235,
"acc_norm_stderr": 0.026917481224377197
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061463,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061463
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37821297429620565,
"mc1_stderr": 0.01697633590754687,
"mc2": 0.5477099988321158,
"mc2_stderr": 0.015436090753363047
},
"harness|winogrande|5": {
"acc": 0.8011049723756906,
"acc_stderr": 0.011218629972515303
},
"harness|gsm8k|5": {
"acc": 0.6732373009855952,
"acc_stderr": 0.012919408108656423
}
}
```
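To recompute a headline number such as the MMLU average from this dump rather than read it off the leaderboard, a short sketch along the following lines can be used. It downloads the results file linked above with `huggingface_hub.hf_hub_download` (the filename comes from this card) and averages the `hendrycksTest-*` entries; whether the per-task scores sit at the top level or under a `"results"` key is an assumption handled defensively here, not something this card specifies.
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file referenced in the "Latest results" section.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B",
    filename="results_2024-01-13T17-41-54.536456.json",
    repo_type="dataset",
)
with open(path) as f:
    payload = json.load(f)

# The per-task scores may be at the top level (as shown above) or nested under
# a "results" key depending on the harness version; handle both.
results = payload.get("results", payload)

mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU tasks: {len(mmlu)}, mean acc: {sum(mmlu) / len(mmlu):.4f}")
```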
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B | [
"region:us"
] | 2024-01-13T17:44:12+00:00 | {"pretty_name": "Evaluation run of flemmingmiguel/Distilled-HermesChat-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [flemmingmiguel/Distilled-HermesChat-7B](https://huggingface.co/flemmingmiguel/Distilled-HermesChat-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T17:41:54.536456](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B/blob/main/results_2024-01-13T17-41-54.536456.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6549679088142555,\n \"acc_stderr\": 0.03191312416103038,\n \"acc_norm\": 0.6559474034222305,\n \"acc_norm_stderr\": 0.03256025642473883,\n \"mc1\": 0.37821297429620565,\n \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.5477099988321158,\n \"mc2_stderr\": 0.015436090753363047\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6399317406143344,\n \"acc_stderr\": 0.014027516814585186,\n \"acc_norm\": 0.6749146757679181,\n \"acc_norm_stderr\": 0.013688147309729124\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6649073889663414,\n \"acc_stderr\": 0.0047105814966393374,\n \"acc_norm\": 0.8521210914160526,\n \"acc_norm_stderr\": 0.0035425443194051424\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396262,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396262\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n \"acc_stderr\": 0.022331707611823078,\n \"acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.022331707611823078\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n \"acc_norm\": 0.9015544041450777,\n 
\"acc_norm_stderr\": 0.02150024957603348\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.023661296393964283,\n \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.023661296393964283\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8623853211009175,\n \"acc_stderr\": 0.014770105878649405,\n \"acc_norm\": 0.8623853211009175,\n \"acc_norm_stderr\": 0.014770105878649405\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n \"acc_stderr\": 0.030216831011508766,\n \"acc_norm\": 0.7174887892376681,\n \"acc_norm_stderr\": 0.030216831011508766\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8403575989782887,\n \"acc_stderr\": 0.013097934513263005,\n \"acc_norm\": 0.8403575989782887,\n \"acc_norm_stderr\": 0.013097934513263005\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.023357365785874037,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.023357365785874037\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28938547486033517,\n \"acc_stderr\": 0.015166544550490298,\n \"acc_norm\": 0.28938547486033517,\n \"acc_norm_stderr\": 0.015166544550490298\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n \"acc_stderr\": 0.02521804037341063,\n \"acc_norm\": 0.729903536977492,\n \"acc_norm_stderr\": 0.02521804037341063\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042117,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042117\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49022164276401564,\n \"acc_stderr\": 0.012767793787729336,\n \"acc_norm\": 0.49022164276401564,\n \"acc_norm_stderr\": 0.012767793787729336\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7316176470588235,\n \"acc_stderr\": 0.026917481224377197,\n \"acc_norm\": 0.7316176470588235,\n \"acc_norm_stderr\": 0.026917481224377197\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061463,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061463\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37821297429620565,\n \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.5477099988321158,\n \"mc2_stderr\": 0.015436090753363047\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8011049723756906,\n \"acc_stderr\": 0.011218629972515303\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6732373009855952,\n 
\"acc_stderr\": 0.012919408108656423\n }\n}\n```", "repo_url": "https://huggingface.co/flemmingmiguel/Distilled-HermesChat-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-41-54.536456.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-41-54.536456.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-41-54.536456.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-41-54.536456.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-41-54.536456.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T17_41_54.536456", "path": ["**/details_harness|winogrande|5_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T17-41-54.536456.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T17_41_54.536456", "path": ["results_2024-01-13T17-41-54.536456.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T17-41-54.536456.parquet"]}]}]} | 2024-01-13T17:44:34+00:00 |
48ca86fe0da4800889ca8279b64abfa7f190b183 |
# Dataset of amazon/アマゾン/女将 (Azur Lane)
This is the dataset of amazon/アマゾン/女将 (Azur Lane), containing 43 images and their tags.
The core tags of this character are `blonde_hair, long_hair, twintails, blue_eyes, ahoge, fang, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 43 | 46.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amazon_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 43 | 29.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amazon_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 100 | 59.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amazon_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 43 | 42.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amazon_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 100 | 77.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amazon_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/amazon_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
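If you only need one of the pre-processed packages from the table above rather than the raw data, it can be fetched and unpacked the same way. Below is a minimal sketch for the 800-pixel IMG+TXT package; it assumes each image in that archive sits next to a same-named `.txt` file holding its tags, which is how the IMG+TXT packages are typically laid out.

```python
import os
import zipfile
from glob import glob

from huggingface_hub import hf_hub_download

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/amazon_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each tag file with its image (the image extension may vary by source post)
for txt_path in glob(os.path.join(dataset_dir, '**', '*.txt'), recursive=True):
    base, _ = os.path.splitext(txt_path)
    image_path = next(
        (base + ext for ext in ('.png', '.jpg', '.jpeg', '.webp') if os.path.exists(base + ext)),
        None,
    )
    with open(txt_path, 'r', encoding='utf-8') as f:
        tags = f.read().strip()
    print(image_path, tags)
```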
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, gloves, looking_at_viewer, solo, cape, open_mouth, sword, belt, black_thighhighs, smile, blush, pleated_skirt, uniform, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | gloves | looking_at_viewer | solo | cape | open_mouth | sword | belt | black_thighhighs | smile | blush | pleated_skirt | uniform | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:--------------------|:-------|:-------|:-------------|:--------|:-------|:-------------------|:--------|:--------|:----------------|:----------|:-------------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/amazon_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T17:46:11+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T17:56:27+00:00 |
f0f14b3e7269637ccb929ad354057370b4d07a38 |
# Dataset of voroshilov/ヴォロシーロフ/伏罗希洛夫 (Azur Lane)
This is the dataset of voroshilov/ヴォロシーロフ/伏罗希洛夫 (Azur Lane), containing 60 images and their tags.
The core tags of this character are `breasts, long_hair, blue_hair, large_breasts, bangs, purple_eyes, very_long_hair, hair_ornament, hair_flower`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 60 | 107.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/voroshilov_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 60 | 52.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/voroshilov_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 157 | 114.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/voroshilov_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 60 | 90.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/voroshilov_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 157 | 170.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/voroshilov_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/voroshilov_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
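Once the archive is extracted, the same `LocalSource` iteration can be used to filter items, e.g. to keep only images carrying a given tag. The sketch below assumes `item.meta['tags']` behaves like a collection of tag names (as iterated above) and that `item.image` is a PIL image with a `save` method.

```python
import os

from waifuc.source import LocalSource

# keep only the images tagged as 'solo' and copy them into a separate folder
source = LocalSource('dataset_dir')
output_dir = 'solo_only'
os.makedirs(output_dir, exist_ok=True)

for item in source:
    if 'solo' in item.meta['tags']:
        # item.image is assumed to be a PIL image here
        item.image.save(os.path.join(output_dir, item.meta['filename']))
```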
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, looking_at_viewer, solo, black_thighhighs, cleavage, bare_shoulders, flower, garter_straps, earrings, thighs, blush, white_dress, covered_navel, wide_sleeves, cowboy_shot, white_leotard, fur-trimmed_coat, parted_lips, simple_background, white_background, open_coat |
| 1 | 23 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, cleavage, collarbone, wet, naked_towel, thighs, bare_shoulders, sitting, closed_mouth, onsen, water, parted_lips, red_eyes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | black_thighhighs | cleavage | bare_shoulders | flower | garter_straps | earrings | thighs | blush | white_dress | covered_navel | wide_sleeves | cowboy_shot | white_leotard | fur-trimmed_coat | parted_lips | simple_background | white_background | open_coat | collarbone | wet | naked_towel | sitting | closed_mouth | onsen | water | red_eyes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-------------------|:-----------|:-----------------|:---------|:----------------|:-----------|:---------|:--------|:--------------|:----------------|:---------------|:--------------|:----------------|:-------------------|:--------------|:--------------------|:-------------------|:------------|:-------------|:------|:--------------|:----------|:---------------|:--------|:--------|:-----------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 1 | 23 |  |  |  |  |  | X | X | X | | X | X | | | | X | X | | | | | | | X | | | | X | X | X | X | X | X | X | X |
| CyberHarem/voroshilov_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T17:46:13+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T18:00:58+00:00 |
e976c2a0c3ab3e4308bbcb8ea70b38703d451112 |
# Dataset of hermann_kunne/ヘルマン・キュンネ/Z19 (Azur Lane)
This is the dataset of hermann_kunne/ヘルマン・キュンネ/Z19 (Azur Lane), containing 26 images and their tags.
The core tags of this character are `black_hair, long_hair, hat, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 26 | 33.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hermann_kunne_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 26 | 18.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hermann_kunne_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 68 | 42.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hermann_kunne_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 26 | 28.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hermann_kunne_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 68 | 60.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hermann_kunne_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hermann_kunne_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
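For a quick look at which tags dominate the crawled set (similar in spirit to the cluster tables below), the tag metadata exposed above can be aggregated directly; this sketch assumes that iterating `item.meta['tags']` yields tag names.

```python
from collections import Counter

from waifuc.source import LocalSource

# count how often each tag appears across the extracted dataset
# (iterating item.meta['tags'] is assumed to yield tag names)
source = LocalSource('dataset_dir')
tag_counts = Counter()
for item in source:
    tag_counts.update(tag for tag in item.meta['tags'])

for tag, count in tag_counts.most_common(20):
    print(f'{tag}: {count}')
```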
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, blush, grey_eyes, looking_at_viewer, midriff, navel, solo, black_jacket, black_skirt, blunt_bangs, pleated_skirt, simple_background, very_long_hair, belt, black_headwear, crop_top, garter_straps, long_sleeves, open_jacket, peaked_cap, red_bowtie, smile, thighhighs, white_background, grey_shirt, open_mouth, outstretched_arms, suspenders, v-shaped_eyebrows |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | grey_eyes | looking_at_viewer | midriff | navel | solo | black_jacket | black_skirt | blunt_bangs | pleated_skirt | simple_background | very_long_hair | belt | black_headwear | crop_top | garter_straps | long_sleeves | open_jacket | peaked_cap | red_bowtie | smile | thighhighs | white_background | grey_shirt | open_mouth | outstretched_arms | suspenders | v-shaped_eyebrows |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:------------|:--------------------|:----------|:--------|:-------|:---------------|:--------------|:--------------|:----------------|:--------------------|:-----------------|:-------|:-----------------|:-----------|:----------------|:---------------|:--------------|:-------------|:-------------|:--------|:-------------|:-------------------|:-------------|:-------------|:--------------------|:-------------|:--------------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/hermann_kunne_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T17:46:19+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T17:53:34+00:00 |
1b443737fb9a454586f5636a8b5d489a2c70365f |
# Dataset of isokaze/磯風/矶风 (Azur Lane)
This is the dataset of isokaze/磯風/矶风 (Azur Lane), containing 39 images and their tags.
The core tags of this character are `animal_ears, green_hair, animal_ear_fluff, hair_ornament, long_hair, green_eyes, fang, thick_eyebrows, bangs, tail, hair_between_eyes, hairband, black_hairband, very_long_hair, fox_ears`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 39 | 46.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/isokaze_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 39 | 27.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/isokaze_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 88 | 58.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/isokaze_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 39 | 41.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/isokaze_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 88 | 81.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/isokaze_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/isokaze_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, :d, fur_trim, long_sleeves, looking_at_viewer, navel, open_clothes, open_mouth, solo, white_thighhighs, wide_sleeves, blush, claw_pose, hair_bell, jingle_bell, full_body, groin, hands_up, platform_footwear, short_eyebrows, standing, white_skirt, zouri, ass_visible_through_thighs, flat_chest, fox_tail, magatama_necklace, midriff, pleated_skirt, red_footwear, revealing_clothes, shide, sparkle, white_background |
| 1 | 11 |  |  |  |  |  | 1girl, hair_bell, jingle_bell, solo, wide_sleeves, blush, looking_at_viewer, open_mouth, black_thighhighs, long_sleeves, :d, white_dress, white_background, standing, cat_ear_legwear, folding_fan, hair_ribbon, holding_fan, bandages, black_capelet, cat_ears, full_body, paw_print, simple_background, tabi, tassel |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | :d | fur_trim | long_sleeves | looking_at_viewer | navel | open_clothes | open_mouth | solo | white_thighhighs | wide_sleeves | blush | claw_pose | hair_bell | jingle_bell | full_body | groin | hands_up | platform_footwear | short_eyebrows | standing | white_skirt | zouri | ass_visible_through_thighs | flat_chest | fox_tail | magatama_necklace | midriff | pleated_skirt | red_footwear | revealing_clothes | shide | sparkle | white_background | black_thighhighs | white_dress | cat_ear_legwear | folding_fan | hair_ribbon | holding_fan | bandages | black_capelet | cat_ears | paw_print | simple_background | tabi | tassel |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----|:-----------|:---------------|:--------------------|:--------|:---------------|:-------------|:-------|:-------------------|:---------------|:--------|:------------|:------------|:--------------|:------------|:--------|:-----------|:--------------------|:-----------------|:-----------|:--------------|:--------|:-----------------------------|:-------------|:-----------|:--------------------|:----------|:----------------|:---------------|:--------------------|:--------|:----------|:-------------------|:-------------------|:--------------|:------------------|:--------------|:--------------|:--------------|:-----------|:----------------|:-----------|:------------|:--------------------|:-------|:---------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | | X | X | | | X | X | | X | X | | X | X | X | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/isokaze_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T17:46:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T17:55:52+00:00 |
10b71e621546f899a3936da8c6c267e497985260 | Reginaldocpv/Re | [
"region:us"
] | 2024-01-13T17:46:57+00:00 | {} | 2024-01-13T17:53:58+00:00 |
|
f9662b491077d7c4fb9f8ba75c8f58f1b7dd48e7 | Shikshya/mini-platypus | [
"region:us"
] | 2024-01-13T17:47:49+00:00 | {} | 2024-01-13T17:47:49+00:00 |
|
8545f5f2afcdc20a5319e5182fea5d6068c9cbb3 |
# Dataset Card for Evaluation run of dhanushreddy29/BrokenKeyboard
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [dhanushreddy29/BrokenKeyboard](https://huggingface.co/dhanushreddy29/BrokenKeyboard) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboard",
"harness_winogrande_5",
split="train")
```
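The aggregated metrics for the run can be pulled in the same way; the sketch below assumes this repository follows the usual leaderboard layout, with a "results" configuration and a "latest" split pointing at the most recent run.

```python
from datasets import load_dataset

# aggregated metrics of the latest evaluation run ("results" config / "latest" split
# are assumed to follow the standard leaderboard naming convention)
results = load_dataset(
    "open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboard",
    "results",
    split="latest",
)
print(results[0])
```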
## Latest results
These are the [latest results from run 2024-01-13T17:49:35.571074](https://huggingface.co/datasets/open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboard/blob/main/results_2024-01-13T17-49-35.571074.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6639982159937499,
"acc_stderr": 0.03167193035355786,
"acc_norm": 0.6650219715139944,
"acc_norm_stderr": 0.03231409871415513,
"mc1": 0.565483476132191,
"mc1_stderr": 0.01735273874925956,
"mc2": 0.7135941358366864,
"mc2_stderr": 0.01506271807008482
},
"harness|arc:challenge|25": {
"acc": 0.6783276450511946,
"acc_stderr": 0.013650488084494162,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266127
},
"harness|hellaswag|10": {
"acc": 0.7103166699860586,
"acc_stderr": 0.0045268830210276325,
"acc_norm": 0.8833897629954193,
"acc_norm_stderr": 0.003202993346991063
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237103,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237103
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736413,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736413
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.031709956060406545,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.031709956060406545
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4894179894179894,
"acc_stderr": 0.025745542276045478,
"acc_norm": 0.4894179894179894,
"acc_norm_stderr": 0.025745542276045478
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.02275520495954294,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02275520495954294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603347,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603347
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206858,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206858
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.029597329730978082,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.029597329730978082
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374313,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374313
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.033674621388960775,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.033674621388960775
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553353,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553353
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.869198312236287,
"acc_stderr": 0.02194876605947077,
"acc_norm": 0.869198312236287,
"acc_norm_stderr": 0.02194876605947077
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.014248873549217575,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.014248873549217575
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992005,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992005
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4100558659217877,
"acc_stderr": 0.01644970820902608,
"acc_norm": 0.4100558659217877,
"acc_norm_stderr": 0.01644970820902608
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7839506172839507,
"acc_stderr": 0.022899162918445803,
"acc_norm": 0.7839506172839507,
"acc_norm_stderr": 0.022899162918445803
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4915254237288136,
"acc_stderr": 0.012768401697269057,
"acc_norm": 0.4915254237288136,
"acc_norm_stderr": 0.012768401697269057
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.02655651947004151,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.02655651947004151
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6879084967320261,
"acc_stderr": 0.01874501120127766,
"acc_norm": 0.6879084967320261,
"acc_norm_stderr": 0.01874501120127766
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.565483476132191,
"mc1_stderr": 0.01735273874925956,
"mc2": 0.7135941358366864,
"mc2_stderr": 0.01506271807008482
},
"harness|winogrande|5": {
"acc": 0.8318863456985004,
"acc_stderr": 0.010510336954166734
},
"harness|gsm8k|5": {
"acc": 0.6429112964366944,
"acc_stderr": 0.013197931775445208
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboard | [
"region:us"
] | 2024-01-13T17:51:49+00:00 | {"pretty_name": "Evaluation run of dhanushreddy29/BrokenKeyboard", "dataset_summary": "Dataset automatically created during the evaluation run of model [dhanushreddy29/BrokenKeyboard](https://huggingface.co/dhanushreddy29/BrokenKeyboard) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboard\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T17:49:35.571074](https://huggingface.co/datasets/open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboard/blob/main/results_2024-01-13T17-49-35.571074.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6639982159937499,\n \"acc_stderr\": 0.03167193035355786,\n \"acc_norm\": 0.6650219715139944,\n \"acc_norm_stderr\": 0.03231409871415513,\n \"mc1\": 0.565483476132191,\n \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.7135941358366864,\n \"mc2_stderr\": 0.01506271807008482\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6783276450511946,\n \"acc_stderr\": 0.013650488084494162,\n \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266127\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7103166699860586,\n \"acc_stderr\": 0.0045268830210276325,\n \"acc_norm\": 0.8833897629954193,\n \"acc_norm_stderr\": 0.003202993346991063\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237103,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237103\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 
0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736413,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736413\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.031709956060406545,\n \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.031709956060406545\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4894179894179894,\n \"acc_stderr\": 0.025745542276045478,\n \"acc_norm\": 0.4894179894179894,\n \"acc_norm_stderr\": 0.025745542276045478\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02275520495954294,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02275520495954294\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603347,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603347\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206858,\n \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206858\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.029597329730978082,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.029597329730978082\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374313,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374313\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5787037037037037,\n \"acc_stderr\": 0.033674621388960775,\n \"acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.033674621388960775\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553353,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553353\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.869198312236287,\n \"acc_stderr\": 0.02194876605947077,\n \"acc_norm\": 0.869198312236287,\n \"acc_norm_stderr\": 0.02194876605947077\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8020434227330779,\n \"acc_stderr\": 0.014248873549217575,\n 
\"acc_norm\": 0.8020434227330779,\n \"acc_norm_stderr\": 0.014248873549217575\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992005,\n \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992005\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4100558659217877,\n \"acc_stderr\": 0.01644970820902608,\n \"acc_norm\": 0.4100558659217877,\n \"acc_norm_stderr\": 0.01644970820902608\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445803,\n \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445803\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4915254237288136,\n \"acc_stderr\": 0.012768401697269057,\n \"acc_norm\": 0.4915254237288136,\n \"acc_norm_stderr\": 0.012768401697269057\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.02655651947004151,\n \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.02655651947004151\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6879084967320261,\n \"acc_stderr\": 0.01874501120127766,\n \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.01874501120127766\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.565483476132191,\n \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.7135941358366864,\n \"mc2_stderr\": 0.01506271807008482\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8318863456985004,\n \"acc_stderr\": 0.010510336954166734\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6429112964366944,\n \"acc_stderr\": 0.013197931775445208\n }\n}\n```", "repo_url": "https://huggingface.co/dhanushreddy29/BrokenKeyboard", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-49-35.571074.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-49-35.571074.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-49-35.571074.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-49-35.571074.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-49-35.571074.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-49-35.571074.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["**/details_harness|winogrande|5_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T17-49-35.571074.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T17_49_35.571074", "path": ["results_2024-01-13T17-49-35.571074.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T17-49-35.571074.parquet"]}]}]} | 2024-01-13T17:52:10+00:00 |
6c21c803ee61b0ba32a9125c70b53386b231dcc5 | Tsuinzues/macacoaranha | [
"license:openrail",
"region:us"
] | 2024-01-13T17:59:39+00:00 | {"license": "openrail"} | 2024-01-13T17:59:54+00:00 |
|
22b30dfffe94826dc77f6f9a1214475fa36112aa | deepghs/fancaps_index | [
"license:mit",
"region:us"
] | 2024-01-13T18:01:18+00:00 | {"license": "mit"} | 2024-01-13T20:32:55+00:00 |
|
8facd4627dc455fe0b4a5ac68296616e3412538f |
# Dataset Card for Evaluation run of brucethemoose/Yi-34B-200K-DARE-merge-v7
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [brucethemoose/Yi-34B-200K-DARE-merge-v7](https://huggingface.co/brucethemoose/Yi-34B-200K-DARE-merge-v7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_brucethemoose__Yi-34B-200K-DARE-merge-v7",
"harness_winogrande_5",
split="train")
```
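
The aggregated scores live in the separate `results` configuration mentioned above; a minimal sketch for pulling its latest split (assuming the standard `datasets` API and the config/split names listed in this card) could look like:

```python
from datasets import load_dataset

# Illustrative sketch: the "results" config aggregates all task scores,
# and its "latest" split points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_brucethemoose__Yi-34B-200K-DARE-merge-v7",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics for that run
```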
## Latest results
These are the [latest results from run 2024-01-13T18:00:33.123437](https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__Yi-34B-200K-DARE-merge-v7/blob/main/results_2024-01-13T18-00-33.123437.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7681401846074161,
"acc_stderr": 0.02789845201480161,
"acc_norm": 0.7728973251356218,
"acc_norm_stderr": 0.02841712456337832,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.5890169683214581,
"mc2_stderr": 0.0152246570119347
},
"harness|arc:challenge|25": {
"acc": 0.6527303754266212,
"acc_stderr": 0.013913034529620453,
"acc_norm": 0.6808873720136519,
"acc_norm_stderr": 0.013621696119173304
},
"harness|hellaswag|10": {
"acc": 0.6590320653256323,
"acc_stderr": 0.004730658073041562,
"acc_norm": 0.8598884684325832,
"acc_norm_stderr": 0.003463933286063885
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.875,
"acc_stderr": 0.026913523521537846,
"acc_norm": 0.875,
"acc_norm_stderr": 0.026913523521537846
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8150943396226416,
"acc_stderr": 0.02389335183446432,
"acc_norm": 0.8150943396226416,
"acc_norm_stderr": 0.02389335183446432
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7829787234042553,
"acc_stderr": 0.02694748312149623,
"acc_norm": 0.7829787234042553,
"acc_norm_stderr": 0.02694748312149623
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7862068965517242,
"acc_stderr": 0.03416520447747548,
"acc_norm": 0.7862068965517242,
"acc_norm_stderr": 0.03416520447747548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.023266512213730564,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.023266512213730564
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5476190476190477,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.5476190476190477,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9096774193548387,
"acc_stderr": 0.016306570644488323,
"acc_norm": 0.9096774193548387,
"acc_norm_stderr": 0.016306570644488323
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6699507389162561,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.6699507389162561,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9343434343434344,
"acc_stderr": 0.017646526677233335,
"acc_norm": 0.9343434343434344,
"acc_norm_stderr": 0.017646526677233335
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.01146452335695318,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.01146452335695318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.823076923076923,
"acc_stderr": 0.019348070174396985,
"acc_norm": 0.823076923076923,
"acc_norm_stderr": 0.019348070174396985
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.030242862397654,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.030242862397654
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02300545944667395,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02300545944667395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5298013245033113,
"acc_stderr": 0.04075224992216979,
"acc_norm": 0.5298013245033113,
"acc_norm_stderr": 0.04075224992216979
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9321100917431193,
"acc_stderr": 0.010785412654517362,
"acc_norm": 0.9321100917431193,
"acc_norm_stderr": 0.010785412654517362
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.03256850570293647,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.03256850570293647
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9362745098039216,
"acc_stderr": 0.01714392165552496,
"acc_norm": 0.9362745098039216,
"acc_norm_stderr": 0.01714392165552496
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8071748878923767,
"acc_stderr": 0.02647824096048937,
"acc_norm": 0.8071748878923767,
"acc_norm_stderr": 0.02647824096048937
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342327,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342327
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.027285246312758957,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.027285246312758957
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.03145703854306251,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.03145703854306251
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.026321383198783674,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.026321383198783674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6071428571428571,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.6071428571428571,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331356,
"acc_norm": 0.8932038834951457,
"acc_norm_stderr": 0.030581088928331356
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.016046261631673137,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.016046261631673137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.9,
"acc_stderr": 0.03015113445777634,
"acc_norm": 0.9,
"acc_norm_stderr": 0.03015113445777634
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9080459770114943,
"acc_stderr": 0.010333225570778518,
"acc_norm": 0.9080459770114943,
"acc_norm_stderr": 0.010333225570778518
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8179190751445087,
"acc_stderr": 0.020776761102512982,
"acc_norm": 0.8179190751445087,
"acc_norm_stderr": 0.020776761102512982
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7497206703910615,
"acc_stderr": 0.014487500852850412,
"acc_norm": 0.7497206703910615,
"acc_norm_stderr": 0.014487500852850412
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8398692810457516,
"acc_stderr": 0.020998740930362303,
"acc_norm": 0.8398692810457516,
"acc_norm_stderr": 0.020998740930362303
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8360128617363344,
"acc_stderr": 0.0210295764646627,
"acc_norm": 0.8360128617363344,
"acc_norm_stderr": 0.0210295764646627
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8672839506172839,
"acc_stderr": 0.018877353839571853,
"acc_norm": 0.8672839506172839,
"acc_norm_stderr": 0.018877353839571853
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6418439716312057,
"acc_stderr": 0.028602085862759422,
"acc_norm": 0.6418439716312057,
"acc_norm_stderr": 0.028602085862759422
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6010430247718384,
"acc_stderr": 0.012506757655293679,
"acc_norm": 0.6010430247718384,
"acc_norm_stderr": 0.012506757655293679
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02236867256288675,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02236867256288675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8300653594771242,
"acc_stderr": 0.015194153113184729,
"acc_norm": 0.8300653594771242,
"acc_norm_stderr": 0.015194153113184729
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8489795918367347,
"acc_stderr": 0.02292300409473685,
"acc_norm": 0.8489795918367347,
"acc_norm_stderr": 0.02292300409473685
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700637,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700637
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.0256432399976243,
"acc_norm": 0.93,
"acc_norm_stderr": 0.0256432399976243
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.5890169683214581,
"mc2_stderr": 0.0152246570119347
},
"harness|winogrande|5": {
"acc": 0.8310970797158642,
"acc_stderr": 0.010529981411838913
},
"harness|gsm8k|5": {
"acc": 0.6535253980288097,
"acc_stderr": 0.013107179054313398
}
}
```
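
If you only need the aggregated JSON shown above, one possible approach (a sketch, assuming `huggingface_hub` is installed; the exact nesting of the downloaded file may differ from the excerpt) is to download the results file referenced in the link above and read it directly:

```python
import json
from huggingface_hub import hf_hub_download

# Sketch: fetch the aggregated results file named in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_brucethemoose__Yi-34B-200K-DARE-merge-v7",
    filename="results_2024-01-13T18-00-33.123437.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
print(list(results.keys()))  # inspect the top-level structure before digging in
```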
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_brucethemoose__Yi-34B-200K-DARE-merge-v7 | [
"region:us"
] | 2024-01-13T18:02:46+00:00 | {"pretty_name": "Evaluation run of brucethemoose/Yi-34B-200K-DARE-merge-v7", "dataset_summary": "Dataset automatically created during the evaluation run of model [brucethemoose/Yi-34B-200K-DARE-merge-v7](https://huggingface.co/brucethemoose/Yi-34B-200K-DARE-merge-v7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_brucethemoose__Yi-34B-200K-DARE-merge-v7\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T18:00:33.123437](https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__Yi-34B-200K-DARE-merge-v7/blob/main/results_2024-01-13T18-00-33.123437.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7681401846074161,\n \"acc_stderr\": 0.02789845201480161,\n \"acc_norm\": 0.7728973251356218,\n \"acc_norm_stderr\": 0.02841712456337832,\n \"mc1\": 0.4283965728274174,\n \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.5890169683214581,\n \"mc2_stderr\": 0.0152246570119347\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6527303754266212,\n \"acc_stderr\": 0.013913034529620453,\n \"acc_norm\": 0.6808873720136519,\n \"acc_norm_stderr\": 0.013621696119173304\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6590320653256323,\n \"acc_stderr\": 0.004730658073041562,\n \"acc_norm\": 0.8598884684325832,\n \"acc_norm_stderr\": 0.003463933286063885\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8150943396226416,\n \"acc_stderr\": 0.02389335183446432,\n \"acc_norm\": 0.8150943396226416,\n \"acc_norm_stderr\": 0.02389335183446432\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7829787234042553,\n \"acc_stderr\": 0.02694748312149623,\n \"acc_norm\": 0.7829787234042553,\n \"acc_norm_stderr\": 0.02694748312149623\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.03416520447747548,\n \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.03416520447747548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.023266512213730564,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.023266512213730564\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5476190476190477,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.5476190476190477,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9096774193548387,\n \"acc_stderr\": 0.016306570644488323,\n \"acc_norm\": 0.9096774193548387,\n \"acc_norm_stderr\": 0.016306570644488323\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6699507389162561,\n \"acc_stderr\": 0.033085304262282574,\n \"acc_norm\": 0.6699507389162561,\n \"acc_norm_stderr\": 0.033085304262282574\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9343434343434344,\n \"acc_stderr\": 0.017646526677233335,\n \"acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.017646526677233335\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.823076923076923,\n \"acc_stderr\": 0.019348070174396985,\n \"acc_norm\": 0.823076923076923,\n \"acc_norm_stderr\": 0.019348070174396985\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.030242862397654,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.030242862397654\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.02300545944667395,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02300545944667395\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5298013245033113,\n \"acc_stderr\": 0.04075224992216979,\n \"acc_norm\": 0.5298013245033113,\n \"acc_norm_stderr\": 0.04075224992216979\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9321100917431193,\n \"acc_stderr\": 0.010785412654517362,\n \"acc_norm\": 0.9321100917431193,\n \"acc_norm_stderr\": 0.010785412654517362\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.03256850570293647,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.03256850570293647\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9362745098039216,\n \"acc_stderr\": 0.01714392165552496,\n \"acc_norm\": 0.9362745098039216,\n \"acc_norm_stderr\": 0.01714392165552496\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n \"acc_stderr\": 0.02647824096048937,\n \"acc_norm\": 0.8071748878923767,\n \"acc_norm_stderr\": 0.02647824096048937\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342327,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342327\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.03145703854306251,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.03145703854306251\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.026321383198783674,\n \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.026321383198783674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6071428571428571,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.6071428571428571,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331356,\n \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331356\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n \"acc_stderr\": 0.016046261631673137,\n \"acc_norm\": 0.9358974358974359,\n \"acc_norm_stderr\": 0.016046261631673137\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.03015113445777634,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.03015113445777634\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9080459770114943,\n \"acc_stderr\": 0.010333225570778518,\n \"acc_norm\": 0.9080459770114943,\n \"acc_norm_stderr\": 0.010333225570778518\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8179190751445087,\n \"acc_stderr\": 0.020776761102512982,\n \"acc_norm\": 0.8179190751445087,\n \"acc_norm_stderr\": 0.020776761102512982\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7497206703910615,\n \"acc_stderr\": 0.014487500852850412,\n \"acc_norm\": 0.7497206703910615,\n \"acc_norm_stderr\": 0.014487500852850412\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8398692810457516,\n \"acc_stderr\": 0.020998740930362303,\n \"acc_norm\": 0.8398692810457516,\n \"acc_norm_stderr\": 0.020998740930362303\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8360128617363344,\n \"acc_stderr\": 0.0210295764646627,\n \"acc_norm\": 0.8360128617363344,\n \"acc_norm_stderr\": 0.0210295764646627\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.018877353839571853,\n \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.018877353839571853\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6418439716312057,\n \"acc_stderr\": 0.028602085862759422,\n \"acc_norm\": 0.6418439716312057,\n \"acc_norm_stderr\": 0.028602085862759422\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6010430247718384,\n \"acc_stderr\": 0.012506757655293679,\n \"acc_norm\": 0.6010430247718384,\n \"acc_norm_stderr\": 0.012506757655293679\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02236867256288675,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02236867256288675\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8300653594771242,\n \"acc_stderr\": 0.015194153113184729,\n \"acc_norm\": 0.8300653594771242,\n \"acc_norm_stderr\": 0.015194153113184729\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8489795918367347,\n \"acc_stderr\": 0.02292300409473685,\n \"acc_norm\": 0.8489795918367347,\n \"acc_norm_stderr\": 0.02292300409473685\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700637,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700637\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.5890169683214581,\n \"mc2_stderr\": 0.0152246570119347\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8310970797158642,\n \"acc_stderr\": 0.010529981411838913\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6535253980288097,\n \"acc_stderr\": 0.013107179054313398\n 
}\n}\n```", "repo_url": "https://huggingface.co/brucethemoose/Yi-34B-200K-DARE-merge-v7", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|arc:challenge|25_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|gsm8k|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hellaswag|10_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-00-33.123437.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-00-33.123437.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-00-33.123437.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T18-00-33.123437.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-00-33.123437.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T18_00_33.123437", "path": ["**/details_harness|winogrande|5_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T18-00-33.123437.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T18_00_33.123437", "path": ["results_2024-01-13T18-00-33.123437.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T18-00-33.123437.parquet"]}]}]} | 2024-01-13T18:03:08+00:00 |
f92fcd61fe1d1d9fac158e97d273f95eef6a6a66 |
# Dataset Card for Evaluation run of brucethemoose/SUS-Bagel-200K-DARE-Test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [brucethemoose/SUS-Bagel-200K-DARE-Test](https://huggingface.co/brucethemoose/SUS-Bagel-200K-DARE-Test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_brucethemoose__SUS-Bagel-200K-DARE-Test",
"harness_winogrande_5",
split="train")
```
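
If you only need the aggregated scores rather than the per-sample details, the "results" configuration described above can be loaded the same way. The snippet below is a minimal sketch; it assumes the "results" configuration exposes a "latest" split like the per-task configurations do.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run (sketch; assumes the
# "results" configuration exposes a "latest" split like the per-task configs).
results = load_dataset(
    "open-llm-leaderboard/details_brucethemoose__SUS-Bagel-200K-DARE-Test",
    "results",
    split="latest",
)

# Each row holds the nested dictionary of scores shown in the next section.
print(results[0])
```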
## Latest results
These are the [latest results from run 2024-01-13T18:09:57.188193](https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__SUS-Bagel-200K-DARE-Test/blob/main/results_2024-01-13T18-09-57.188193.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7658118587114254,
"acc_stderr": 0.02808039655994379,
"acc_norm": 0.7696925363139744,
"acc_norm_stderr": 0.02861463324453946,
"mc1": 0.44920440636474906,
"mc1_stderr": 0.017412941986115305,
"mc2": 0.6119893427851197,
"mc2_stderr": 0.014925989149943244
},
"harness|arc:challenge|25": {
"acc": 0.6527303754266212,
"acc_stderr": 0.013913034529620456,
"acc_norm": 0.6808873720136519,
"acc_norm_stderr": 0.013621696119173306
},
"harness|hellaswag|10": {
"acc": 0.6566421031666999,
"acc_stderr": 0.0047385929002801905,
"acc_norm": 0.8538139812786297,
"acc_norm_stderr": 0.003525705773353417
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066652,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066652
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8881578947368421,
"acc_stderr": 0.02564834125169361,
"acc_norm": 0.8881578947368421,
"acc_norm_stderr": 0.02564834125169361
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.02426297983937228,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.02426297983937228
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.025545239210256917,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.025545239210256917
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.55,
"acc_stderr": 0.05000000000000001,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05000000000000001
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.033450369167889904,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.033450369167889904
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7829787234042553,
"acc_stderr": 0.026947483121496217,
"acc_norm": 0.7829787234042553,
"acc_norm_stderr": 0.026947483121496217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5614035087719298,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.5614035087719298,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7586206896551724,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.7586206896551724,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.708994708994709,
"acc_stderr": 0.023393826500484875,
"acc_norm": 0.708994708994709,
"acc_norm_stderr": 0.023393826500484875
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5793650793650794,
"acc_stderr": 0.04415438226743745,
"acc_norm": 0.5793650793650794,
"acc_norm_stderr": 0.04415438226743745
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9032258064516129,
"acc_stderr": 0.016818943416345197,
"acc_norm": 0.9032258064516129,
"acc_norm_stderr": 0.016818943416345197
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6847290640394089,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.6847290640394089,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865387,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865387
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9191919191919192,
"acc_stderr": 0.019417681889724536,
"acc_norm": 0.9191919191919192,
"acc_norm_stderr": 0.019417681889724536
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.01028141701190903,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.01028141701190903
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8128205128205128,
"acc_stderr": 0.01977660108655004,
"acc_norm": 0.8128205128205128,
"acc_norm_stderr": 0.01977660108655004
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45555555555555555,
"acc_stderr": 0.03036486250482443,
"acc_norm": 0.45555555555555555,
"acc_norm_stderr": 0.03036486250482443
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8403361344537815,
"acc_stderr": 0.0237933539975288,
"acc_norm": 0.8403361344537815,
"acc_norm_stderr": 0.0237933539975288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9192660550458716,
"acc_stderr": 0.011680172292862088,
"acc_norm": 0.9192660550458716,
"acc_norm_stderr": 0.011680172292862088
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.032757734861009996,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.032757734861009996
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.01926932302564026,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.01926932302564026
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597446,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597446
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.027285246312758957,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.027285246312758957
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.029239272675632748,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.029239272675632748
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8773006134969326,
"acc_stderr": 0.025777328426978927,
"acc_norm": 0.8773006134969326,
"acc_norm_stderr": 0.025777328426978927
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5625,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.9029126213592233,
"acc_stderr": 0.02931596291881348,
"acc_norm": 0.9029126213592233,
"acc_norm_stderr": 0.02931596291881348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.016046261631673137,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.016046261631673137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9042145593869731,
"acc_stderr": 0.01052403107905584,
"acc_norm": 0.9042145593869731,
"acc_norm_stderr": 0.01052403107905584
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.815028901734104,
"acc_stderr": 0.020903975842083027,
"acc_norm": 0.815028901734104,
"acc_norm_stderr": 0.020903975842083027
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7754189944134078,
"acc_stderr": 0.01395680366654464,
"acc_norm": 0.7754189944134078,
"acc_norm_stderr": 0.01395680366654464
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.020823758837580912,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.020823758837580912
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8360128617363344,
"acc_stderr": 0.0210295764646627,
"acc_norm": 0.8360128617363344,
"acc_norm_stderr": 0.0210295764646627
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.018689725721062072,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.018689725721062072
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6205673758865248,
"acc_stderr": 0.028947338851614098,
"acc_norm": 0.6205673758865248,
"acc_norm_stderr": 0.028947338851614098
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.605606258148631,
"acc_stderr": 0.01248214166563118,
"acc_norm": 0.605606258148631,
"acc_norm_stderr": 0.01248214166563118
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.023886881922440335,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.023886881922440335
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8218954248366013,
"acc_stderr": 0.015478369653108568,
"acc_norm": 0.8218954248366013,
"acc_norm_stderr": 0.015478369653108568
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8367346938775511,
"acc_stderr": 0.02366169917709861,
"acc_norm": 0.8367346938775511,
"acc_norm_stderr": 0.02366169917709861
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072867,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072867
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44920440636474906,
"mc1_stderr": 0.017412941986115305,
"mc2": 0.6119893427851197,
"mc2_stderr": 0.014925989149943244
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.010430917468237433
},
"harness|gsm8k|5": {
"acc": 0.6929492039423806,
"acc_stderr": 0.012705685723131702
}
}
```
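
To drill down from these aggregate numbers to individual predictions, each benchmark has its own configuration (for example the GSM8K details behind the ~0.69 accuracy above). The following is a sketch under the assumption that this card follows the usual config-naming pattern of these leaderboard detail datasets, e.g. `harness_gsm8k_5`; the exact column layout of each row depends on the harness version that produced the run.

```python
from datasets import load_dataset

# Per-sample details for one benchmark (sketch; "harness_gsm8k_5" is assumed to
# follow the config-naming pattern used by these leaderboard detail datasets).
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_brucethemoose__SUS-Bagel-200K-DARE-Test",
    "harness_gsm8k_5",
    split="latest",
)

# Inspect the available columns and the first example.
print(gsm8k_details.column_names)
print(gsm8k_details[0])
```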
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_brucethemoose__SUS-Bagel-200K-DARE-Test | [
"region:us"
] | 2024-01-13T18:12:13+00:00 | {"pretty_name": "Evaluation run of brucethemoose/SUS-Bagel-200K-DARE-Test", "dataset_summary": "Dataset automatically created during the evaluation run of model [brucethemoose/SUS-Bagel-200K-DARE-Test](https://huggingface.co/brucethemoose/SUS-Bagel-200K-DARE-Test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_brucethemoose__SUS-Bagel-200K-DARE-Test\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T18:09:57.188193](https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__SUS-Bagel-200K-DARE-Test/blob/main/results_2024-01-13T18-09-57.188193.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7658118587114254,\n \"acc_stderr\": 0.02808039655994379,\n \"acc_norm\": 0.7696925363139744,\n \"acc_norm_stderr\": 0.02861463324453946,\n \"mc1\": 0.44920440636474906,\n \"mc1_stderr\": 0.017412941986115305,\n \"mc2\": 0.6119893427851197,\n \"mc2_stderr\": 0.014925989149943244\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6527303754266212,\n \"acc_stderr\": 0.013913034529620456,\n \"acc_norm\": 0.6808873720136519,\n \"acc_norm_stderr\": 0.013621696119173306\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6566421031666999,\n \"acc_stderr\": 0.0047385929002801905,\n \"acc_norm\": 0.8538139812786297,\n \"acc_norm_stderr\": 0.003525705773353417\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.03785714465066652,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.03785714465066652\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8881578947368421,\n \"acc_stderr\": 0.02564834125169361,\n \"acc_norm\": 0.8881578947368421,\n \"acc_norm_stderr\": 0.02564834125169361\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.02426297983937228,\n \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.02426297983937228\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.025545239210256917\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05000000000000001,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05000000000000001\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.033450369167889904,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.033450369167889904\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.049665709039785295,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.049665709039785295\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7829787234042553,\n \"acc_stderr\": 0.026947483121496217,\n \"acc_norm\": 0.7829787234042553,\n \"acc_norm_stderr\": 0.026947483121496217\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5614035087719298,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.5614035087719298,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7586206896551724,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.7586206896551724,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.708994708994709,\n \"acc_stderr\": 0.023393826500484875,\n \"acc_norm\": 0.708994708994709,\n \"acc_norm_stderr\": 0.023393826500484875\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5793650793650794,\n \"acc_stderr\": 0.04415438226743745,\n \"acc_norm\": 0.5793650793650794,\n \"acc_norm_stderr\": 0.04415438226743745\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9032258064516129,\n \"acc_stderr\": 0.016818943416345197,\n \"acc_norm\": 0.9032258064516129,\n \"acc_norm_stderr\": 0.016818943416345197\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6847290640394089,\n \"acc_stderr\": 0.03269080871970186,\n \"acc_norm\": 0.6847290640394089,\n \"acc_norm_stderr\": 0.03269080871970186\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865387,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865387\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9191919191919192,\n \"acc_stderr\": 0.019417681889724536,\n \"acc_norm\": 0.9191919191919192,\n \"acc_norm_stderr\": 0.019417681889724536\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.01028141701190903,\n \"acc_norm\": 0.9792746113989638,\n 
\"acc_norm_stderr\": 0.01028141701190903\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8128205128205128,\n \"acc_stderr\": 0.01977660108655004,\n \"acc_norm\": 0.8128205128205128,\n \"acc_norm_stderr\": 0.01977660108655004\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.45555555555555555,\n \"acc_stderr\": 0.03036486250482443,\n \"acc_norm\": 0.45555555555555555,\n \"acc_norm_stderr\": 0.03036486250482443\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8403361344537815,\n \"acc_stderr\": 0.0237933539975288,\n \"acc_norm\": 0.8403361344537815,\n \"acc_norm_stderr\": 0.0237933539975288\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248437,\n \"acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248437\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9192660550458716,\n \"acc_stderr\": 0.011680172292862088,\n \"acc_norm\": 0.9192660550458716,\n \"acc_norm_stderr\": 0.011680172292862088\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.032757734861009996,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.032757734861009996\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9029535864978903,\n \"acc_stderr\": 0.01926932302564026,\n \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.01926932302564026\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597446,\n \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597446\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.029239272675632748,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.029239272675632748\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8773006134969326,\n \"acc_stderr\": 0.025777328426978927,\n \"acc_norm\": 0.8773006134969326,\n \"acc_norm_stderr\": 0.025777328426978927\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9029126213592233,\n \"acc_stderr\": 0.02931596291881348,\n \"acc_norm\": 0.9029126213592233,\n \"acc_norm_stderr\": 0.02931596291881348\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n \"acc_stderr\": 0.016046261631673137,\n \"acc_norm\": 0.9358974358974359,\n \"acc_norm_stderr\": 0.016046261631673137\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9042145593869731,\n \"acc_stderr\": 0.01052403107905584,\n \"acc_norm\": 0.9042145593869731,\n \"acc_norm_stderr\": 0.01052403107905584\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.815028901734104,\n \"acc_stderr\": 0.020903975842083027,\n \"acc_norm\": 0.815028901734104,\n \"acc_norm_stderr\": 0.020903975842083027\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7754189944134078,\n \"acc_stderr\": 0.01395680366654464,\n \"acc_norm\": 0.7754189944134078,\n \"acc_norm_stderr\": 0.01395680366654464\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.020823758837580912,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.020823758837580912\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8360128617363344,\n \"acc_stderr\": 0.0210295764646627,\n \"acc_norm\": 0.8360128617363344,\n \"acc_norm_stderr\": 0.0210295764646627\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.018689725721062072,\n \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.018689725721062072\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6205673758865248,\n \"acc_stderr\": 0.028947338851614098,\n \"acc_norm\": 0.6205673758865248,\n \"acc_norm_stderr\": 0.028947338851614098\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.605606258148631,\n \"acc_stderr\": 0.01248214166563118,\n \"acc_norm\": 0.605606258148631,\n \"acc_norm_stderr\": 0.01248214166563118\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.023886881922440335,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.023886881922440335\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8218954248366013,\n \"acc_stderr\": 0.015478369653108568,\n \"acc_norm\": 0.8218954248366013,\n \"acc_norm_stderr\": 0.015478369653108568\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8367346938775511,\n \"acc_stderr\": 0.02366169917709861,\n \"acc_norm\": 0.8367346938775511,\n \"acc_norm_stderr\": 0.02366169917709861\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072867,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072867\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44920440636474906,\n \"mc1_stderr\": 0.017412941986115305,\n \"mc2\": 0.6119893427851197,\n \"mc2_stderr\": 0.014925989149943244\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.010430917468237433\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6929492039423806,\n \"acc_stderr\": 0.012705685723131702\n }\n}\n```", "repo_url": "https://huggingface.co/brucethemoose/SUS-Bagel-200K-DARE-Test", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|arc:challenge|25_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|gsm8k|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hellaswag|10_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-09-57.188193.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-09-57.188193.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-09-57.188193.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T18-09-57.188193.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-09-57.188193.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T18_09_57.188193", "path": ["**/details_harness|winogrande|5_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T18-09-57.188193.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T18_09_57.188193", "path": ["results_2024-01-13T18-09-57.188193.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T18-09-57.188193.parquet"]}]}]} | 2024-01-13T18:12:35+00:00 |
acc6c60ae6638884d14269f7bc0984e43a99cbea |
# Dataset Card for Evaluation run of sequelbox/DiamondForce
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sequelbox/DiamondForce](https://huggingface.co/sequelbox/DiamondForce) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sequelbox__DiamondForce",
"harness_winogrande_5",
split="train")
```
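
The aggregated metrics referenced above are stored in the "results" configuration. A minimal sketch for loading them (assuming the `results` config and its `latest` split declared in this card's metadata):

```python
from datasets import load_dataset

# Aggregated, run-level metrics; the "latest" split always points to the most recent run
results = load_dataset("open-llm-leaderboard/details_sequelbox__DiamondForce",
                       "results",
                       split="latest")

print(results[0])  # inspect the aggregated scores for the latest run
```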
## Latest results
These are the [latest results from run 2024-01-13T18:13:28.839818](https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__DiamondForce/blob/main/results_2024-01-13T18-13-28.839818.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5803032096933994,
"acc_stderr": 0.033300956946307504,
"acc_norm": 0.5859341567740417,
"acc_norm_stderr": 0.03400152897884741,
"mc1": 0.3292533659730722,
"mc1_stderr": 0.016451264440068232,
"mc2": 0.46457835926357594,
"mc2_stderr": 0.01513153294586495
},
"harness|arc:challenge|25": {
"acc": 0.5776450511945392,
"acc_stderr": 0.014434138713379981,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.014175915490000326
},
"harness|hellaswag|10": {
"acc": 0.629555865365465,
"acc_stderr": 0.004819367172685967,
"acc_norm": 0.8342959569806812,
"acc_norm_stderr": 0.0037105487209054154
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.04033565667848319,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.04033565667848319
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.029890609686286637,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.029890609686286637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283648,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283648
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.023695415009463087,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.023695415009463087
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.043062412591271526,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.043062412591271526
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6516129032258065,
"acc_stderr": 0.027104826328100944,
"acc_norm": 0.6516129032258065,
"acc_norm_stderr": 0.027104826328100944
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124498,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124498
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.02749350424454806,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.02749350424454806
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5487179487179488,
"acc_stderr": 0.025230381238934833,
"acc_norm": 0.5487179487179488,
"acc_norm_stderr": 0.025230381238934833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606646,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606646
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5504201680672269,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.5504201680672269,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7724770642201835,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.7724770642201835,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502327,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502327
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260594,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260594
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543688,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543688
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7867177522349936,
"acc_stderr": 0.014648172749593517,
"acc_norm": 0.7867177522349936,
"acc_norm_stderr": 0.014648172749593517
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.0258167567915842,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.0258167567915842
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43575418994413406,
"acc_stderr": 0.016583881958602394,
"acc_norm": 0.43575418994413406,
"acc_norm_stderr": 0.016583881958602394
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.027684181883302898,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.027684181883302898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776162,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776162
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.029316011776343555,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.029316011776343555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43741851368970014,
"acc_stderr": 0.012669813464935726,
"acc_norm": 0.43741851368970014,
"acc_norm_stderr": 0.012669813464935726
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5625,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.019794488900024117,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.019794488900024117
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547724,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547724
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017193,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3292533659730722,
"mc1_stderr": 0.016451264440068232,
"mc2": 0.46457835926357594,
"mc2_stderr": 0.01513153294586495
},
"harness|winogrande|5": {
"acc": 0.7900552486187845,
"acc_stderr": 0.01144628062926263
},
"harness|gsm8k|5": {
"acc": 0.28658074298711145,
"acc_stderr": 0.012454841668337704
}
}
```
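
For quick inspection outside the `datasets` library, the JSON block above can also be parsed directly. A small sketch, assuming it has been saved locally as `results.json` (a hypothetical file name):

```python
import json

# Hypothetical local copy of the results block shown above
with open("results.json") as f:
    results = json.load(f)

# Pull out a few headline numbers using the task keys from the block above
print("ARC acc_norm:      ", results["harness|arc:challenge|25"]["acc_norm"])
print("HellaSwag acc_norm:", results["harness|hellaswag|10"]["acc_norm"])
print("TruthfulQA mc2:    ", results["harness|truthfulqa:mc|0"]["mc2"])
print("Winogrande acc:    ", results["harness|winogrande|5"]["acc"])
print("GSM8K acc:         ", results["harness|gsm8k|5"]["acc"])
```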
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_sequelbox__DiamondForce | [
"region:us"
] | 2024-01-13T18:15:48+00:00 | {"pretty_name": "Evaluation run of sequelbox/DiamondForce", "dataset_summary": "Dataset automatically created during the evaluation run of model [sequelbox/DiamondForce](https://huggingface.co/sequelbox/DiamondForce) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sequelbox__DiamondForce\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T18:13:28.839818](https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__DiamondForce/blob/main/results_2024-01-13T18-13-28.839818.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5803032096933994,\n \"acc_stderr\": 0.033300956946307504,\n \"acc_norm\": 0.5859341567740417,\n \"acc_norm_stderr\": 0.03400152897884741,\n \"mc1\": 0.3292533659730722,\n \"mc1_stderr\": 0.016451264440068232,\n \"mc2\": 0.46457835926357594,\n \"mc2_stderr\": 0.01513153294586495\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5776450511945392,\n \"acc_stderr\": 0.014434138713379981,\n \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000326\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.629555865365465,\n \"acc_stderr\": 0.004819367172685967,\n \"acc_norm\": 0.8342959569806812,\n \"acc_norm_stderr\": 0.0037105487209054154\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.04033565667848319,\n \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.04033565667848319\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286637,\n \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n 
\"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.03789401760283648,\n \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.03789401760283648\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30423280423280424,\n \"acc_stderr\": 0.023695415009463087,\n \"acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.023695415009463087\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.043062412591271526,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.043062412591271526\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n \"acc_stderr\": 0.027104826328100944,\n \"acc_norm\": 0.6516129032258065,\n \"acc_norm_stderr\": 0.027104826328100944\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124498,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124498\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.02749350424454806,\n \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.02749350424454806\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5487179487179488,\n \"acc_stderr\": 0.025230381238934833,\n \"acc_norm\": 0.5487179487179488,\n \"acc_norm_stderr\": 0.025230381238934833\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606646,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606646\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5504201680672269,\n \"acc_stderr\": 0.03231293497137707,\n \"acc_norm\": 0.5504201680672269,\n \"acc_norm_stderr\": 0.03231293497137707\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7724770642201835,\n \"acc_stderr\": 0.017974463578776502,\n \"acc_norm\": 0.7724770642201835,\n \"acc_norm_stderr\": 0.017974463578776502\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502327,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502327\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260594,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260594\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.024414947304543688,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.024414947304543688\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7867177522349936,\n \"acc_stderr\": 0.014648172749593517,\n \"acc_norm\": 0.7867177522349936,\n 
\"acc_norm_stderr\": 0.014648172749593517\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.0258167567915842,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.0258167567915842\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43575418994413406,\n \"acc_stderr\": 0.016583881958602394,\n \"acc_norm\": 0.43575418994413406,\n \"acc_norm_stderr\": 0.016583881958602394\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.027684181883302898,\n \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.027684181883302898\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n \"acc_stderr\": 0.026920841260776162,\n \"acc_norm\": 0.6591639871382636,\n \"acc_norm_stderr\": 0.026920841260776162\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765134,\n \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765134\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4078014184397163,\n \"acc_stderr\": 0.029316011776343555,\n \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.029316011776343555\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43741851368970014,\n \"acc_stderr\": 0.012669813464935726,\n \"acc_norm\": 0.43741851368970014,\n \"acc_norm_stderr\": 0.012669813464935726\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.019794488900024117,\n \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.019794488900024117\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547724,\n \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547724\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n \"acc_stderr\": 0.029475250236017193,\n \"acc_norm\": 0.7761194029850746,\n \"acc_norm_stderr\": 0.029475250236017193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3292533659730722,\n \"mc1_stderr\": 0.016451264440068232,\n \"mc2\": 0.46457835926357594,\n \"mc2_stderr\": 0.01513153294586495\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.01144628062926263\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.28658074298711145,\n \"acc_stderr\": 0.012454841668337704\n }\n}\n```", "repo_url": "https://huggingface.co/sequelbox/DiamondForce", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|arc:challenge|25_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|gsm8k|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hellaswag|10_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-13-28.839818.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-13-28.839818.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-13-28.839818.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T18-13-28.839818.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-13-28.839818.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-13-28.839818.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["**/details_harness|winogrande|5_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T18-13-28.839818.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T18_13_28.839818", "path": ["results_2024-01-13T18-13-28.839818.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T18-13-28.839818.parquet"]}]}]} | 2024-01-13T18:16:09+00:00 |
4125ccc670cbb241b465cc070fb7304923e671e3 |
# Dataset of hatsuzuki/初月/初月 (Azur Lane)
This is the dataset of hatsuzuki/初月/初月 (Azur Lane), containing 41 images and their tags.
The core tags of this character are `black_hair, long_hair, red_eyes, breasts, bangs, red_hair, multicolored_hair, horns, twintails, small_breasts, two-tone_hair, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 41 | 76.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsuzuki_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 41 | 33.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsuzuki_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 108 | 77.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsuzuki_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 41 | 61.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsuzuki_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 108 | 121.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsuzuki_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hatsuzuki_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
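The `IMG+TXT` packages listed above can also be read without waifuc. A minimal sketch, assuming each image in the archive is accompanied by a same-named `.txt` file holding its tags (the layout implied by the IMG+TXT package type; adjust if the archive differs):

```python
import os
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download

# download one of the pre-processed IMG+TXT packages (the 800px one here)
zip_file = hf_hub_download(
    repo_id='CyberHarem/hatsuzuki_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to a separate directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair every image with its same-named .txt tag file (assumed layout)
for image_path in sorted(Path(dataset_dir).rglob('*')):
    if image_path.suffix.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
        continue
    tag_path = image_path.with_suffix('.txt')
    if tag_path.exists():
        print(image_path.name, '->', tag_path.read_text(encoding='utf-8').strip())
```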
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, solo, detached_sleeves, open_mouth, wide_sleeves, black_pantyhose, bare_shoulders, smile, white_background, cleavage, katana, simple_background, holding_sword, japanese_clothes, skirt |
| 1 | 12 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, navel, solo, black_bikini, thigh_strap, black_choker, open_mouth, open_shirt, sitting, stomach, blush, innertube, white_shirt, collarbone, see-through, simple_background, thighs, water, white_background, :d, barefoot, medium_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | detached_sleeves | open_mouth | wide_sleeves | black_pantyhose | bare_shoulders | smile | white_background | cleavage | katana | simple_background | holding_sword | japanese_clothes | skirt | long_sleeves | navel | black_bikini | thigh_strap | black_choker | open_shirt | sitting | stomach | blush | innertube | white_shirt | collarbone | see-through | thighs | water | :d | barefoot | medium_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-------------------|:-------------|:---------------|:------------------|:-----------------|:--------|:-------------------|:-----------|:---------|:--------------------|:----------------|:-------------------|:--------|:---------------|:--------|:---------------|:--------------|:---------------|:-------------|:----------|:----------|:--------|:------------|:--------------|:-------------|:--------------|:---------|:--------|:-----|:-----------|:-----------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | X | X | | X | | | | | X | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/hatsuzuki_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T18:24:24+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T18:33:49+00:00 |
fc6c64263c59939e0ca5c016bd69ede6cbc77407 |
# Dataset of gridley/グリッドレイ/格里德利 (Azur Lane)
This is the dataset of gridley/グリッドレイ/格里德利 (Azur Lane), containing 12 images and their tags.
The core tags of this character are `blonde_hair, blue_eyes, bangs, hair_between_eyes, ahoge, long_hair, bow, hair_ornament, two_side_up, drill_hair, red_bow, animal_ears, deer_ears, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 16.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gridley_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 9.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gridley_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 30 | 20.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gridley_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 14.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gridley_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 30 | 29.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gridley_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/gridley_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | blush, 1girl, bare_shoulders, looking_at_viewer, smile, solo, holding, open_mouth, sleeveless, thighhighs, camera, christmas, red_dress, reindeer_antlers, santa_costume, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | 1girl | bare_shoulders | looking_at_viewer | smile | solo | holding | open_mouth | sleeveless | thighhighs | camera | christmas | red_dress | reindeer_antlers | santa_costume | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-----------------|:--------------------|:--------|:-------|:----------|:-------------|:-------------|:-------------|:---------|:------------|:------------|:-------------------|:----------------|:-------------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/gridley_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T18:24:40+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T18:30:46+00:00 |
96526af11167969874b4cb88ae72ade7bc5f7f03 | SilentAntagonist/csv | [
"region:us"
] | 2024-01-13T18:29:21+00:00 | {} | 2024-01-13T18:29:41+00:00 |
|
3c873a5d9d6dc4fe432bcf7648e02a4fb64bbb59 |
# Dataset Card for Evaluation run of aari1995/germeo-7b-laser
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [aari1995/germeo-7b-laser](https://huggingface.co/aari1995/germeo-7b-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aari1995__germeo-7b-laser",
"harness_winogrande_5",
split="train")
```
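The card's metadata also exposes an aggregated `results` configuration with a `latest` split (config and split names taken from this card's metadata); a minimal sketch for loading it:

```python
from datasets import load_dataset

# aggregated metrics of the latest run (config/split names from this card's metadata)
results = load_dataset("open-llm-leaderboard/details_aari1995__germeo-7b-laser",
                       "results",
                       split="latest")
print(results[0])
```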
## Latest results
These are the [latest results from run 2024-01-13T18:27:49.824954](https://huggingface.co/datasets/open-llm-leaderboard/details_aari1995__germeo-7b-laser/blob/main/results_2024-01-13T18-27-49.824954.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6055285169834799,
"acc_stderr": 0.033079665720799664,
"acc_norm": 0.6095438527185658,
"acc_norm_stderr": 0.03374506182230424,
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5382753959859625,
"mc2_stderr": 0.01572725969894502
},
"harness|arc:challenge|25": {
"acc": 0.5784982935153583,
"acc_stderr": 0.014430197069326023,
"acc_norm": 0.6075085324232082,
"acc_norm_stderr": 0.014269634635670728
},
"harness|hellaswag|10": {
"acc": 0.6415056761601274,
"acc_stderr": 0.004785781979354868,
"acc_norm": 0.8281218880701056,
"acc_norm_stderr": 0.003765034286153438
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.02499305339776482,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.02499305339776482
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.02503387058301518,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.02503387058301518
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.024603626924097424,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.024603626924097424
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.017030719339154333,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.017030719339154333
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808517,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808517
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.043012503996908764,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.043012503996908764
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7982120051085568,
"acc_stderr": 0.014351702181636863,
"acc_norm": 0.7982120051085568,
"acc_norm_stderr": 0.014351702181636863
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6791907514450867,
"acc_stderr": 0.0251310002336479,
"acc_norm": 0.6791907514450867,
"acc_norm_stderr": 0.0251310002336479
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31731843575418994,
"acc_stderr": 0.015566392630057031,
"acc_norm": 0.31731843575418994,
"acc_norm_stderr": 0.015566392630057031
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.027184498909941613,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.027184498909941613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.02555765398186807,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.02555765398186807
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.012732398286190444,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.012732398286190444
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5845588235294118,
"acc_stderr": 0.029935342707877746,
"acc_norm": 0.5845588235294118,
"acc_norm_stderr": 0.029935342707877746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.019706875804085634,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.019706875804085634
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982066,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982066
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072766,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072766
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5382753959859625,
"mc2_stderr": 0.01572725969894502
},
"harness|winogrande|5": {
"acc": 0.7561168113654302,
"acc_stderr": 0.01206892327890819
},
"harness|gsm8k|5": {
"acc": 0.4336618650492798,
"acc_stderr": 0.013650728047064685
}
}
```
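If you prefer to work with the raw JSON linked above rather than the `datasets` API, a minimal sketch for pulling out the headline aggregates is shown below; the file name is taken from the link above, and the key layout is assumed to match the excerpt (with a fallback in case the full file nests these blocks under a `results` key).

```python
import json

# results file downloaded from the link above (file name taken from that link)
with open("results_2024-01-13T18-27-49.824954.json", "r", encoding="utf-8") as f:
    data = json.load(f)

# the excerpt above shows the per-task blocks at the top level; fall back to a
# nested "results" key in case the full file wraps them (assumption, not verified)
scores = data.get("results", data)

overall = scores["all"]
print(f"acc={overall['acc']:.4f}  acc_norm={overall['acc_norm']:.4f}  mc2={overall['mc2']:.4f}")

# a couple of the per-task numbers quoted in the excerpt above
for task in ("harness|arc:challenge|25", "harness|gsm8k|5"):
    print(task, scores[task]["acc"])
```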
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_aari1995__germeo-7b-laser | [
"region:us"
] | 2024-01-13T18:30:09+00:00 | {"pretty_name": "Evaluation run of aari1995/germeo-7b-laser", "dataset_summary": "Dataset automatically created during the evaluation run of model [aari1995/germeo-7b-laser](https://huggingface.co/aari1995/germeo-7b-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aari1995__germeo-7b-laser\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T18:27:49.824954](https://huggingface.co/datasets/open-llm-leaderboard/details_aari1995__germeo-7b-laser/blob/main/results_2024-01-13T18-27-49.824954.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6055285169834799,\n \"acc_stderr\": 0.033079665720799664,\n \"acc_norm\": 0.6095438527185658,\n \"acc_norm_stderr\": 0.03374506182230424,\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5382753959859625,\n \"mc2_stderr\": 0.01572725969894502\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5784982935153583,\n \"acc_stderr\": 0.014430197069326023,\n \"acc_norm\": 0.6075085324232082,\n \"acc_norm_stderr\": 0.014269634635670728\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6415056761601274,\n \"acc_stderr\": 0.004785781979354868,\n \"acc_norm\": 0.8281218880701056,\n \"acc_norm_stderr\": 0.003765034286153438\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n 
\"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n \"acc_stderr\": 0.02499305339776482,\n \"acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.02499305339776482\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.02503387058301518,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.02503387058301518\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.024603626924097424,\n 
\"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097424\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8036697247706422,\n \"acc_stderr\": 0.017030719339154333,\n \"acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.017030719339154333\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808517,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808517\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.043012503996908764,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.043012503996908764\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7982120051085568,\n \"acc_stderr\": 0.014351702181636863,\n \"acc_norm\": 0.7982120051085568,\n \"acc_norm_stderr\": 
0.014351702181636863\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.0251310002336479,\n \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.0251310002336479\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31731843575418994,\n \"acc_stderr\": 0.015566392630057031,\n \"acc_norm\": 0.31731843575418994,\n \"acc_norm_stderr\": 0.015566392630057031\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.027184498909941613,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.027184498909941613\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.02555765398186807,\n \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.02555765398186807\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.012732398286190444,\n \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.012732398286190444\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5845588235294118,\n \"acc_stderr\": 0.029935342707877746,\n \"acc_norm\": 0.5845588235294118,\n \"acc_norm_stderr\": 0.029935342707877746\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6127450980392157,\n \"acc_stderr\": 0.019706875804085634,\n \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.019706875804085634\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982066,\n \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982066\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072766,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072766\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5382753959859625,\n \"mc2_stderr\": 0.01572725969894502\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7561168113654302,\n \"acc_stderr\": 0.01206892327890819\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4336618650492798,\n \"acc_stderr\": 0.013650728047064685\n }\n}\n```", "repo_url": "https://huggingface.co/aari1995/germeo-7b-laser", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|arc:challenge|25_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|gsm8k|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hellaswag|10_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-27-49.824954.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-27-49.824954.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-27-49.824954.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T18-27-49.824954.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-27-49.824954.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-27-49.824954.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["**/details_harness|winogrande|5_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T18-27-49.824954.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T18_27_49.824954", "path": ["results_2024-01-13T18-27-49.824954.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T18-27-49.824954.parquet"]}]}]} | 2024-01-13T18:30:31+00:00 |
c28b7769311f1770e05a47054c5fc22275fd797f | hiepdaoquang704/test_vietnamese | [
"region:us"
] | 2024-01-13T18:34:02+00:00 | {"dataset_info": {"features": [{"name": "content", "dtype": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4049969, "num_examples": 1000}], "download_size": 2141778, "dataset_size": 4049969}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-13T18:34:04+00:00 |
|
48a338f806088c9430e36f608e81a7818004399d |
# Dataset of blucher/ブリュッヒャー/布吕歇尔 (Azur Lane)
This is the dataset of blucher/ブリュッヒャー/布吕歇尔 (Azur Lane), containing 40 images and their tags.
The core tags of this character are `long_hair, blonde_hair, red_eyes, breasts, ahoge, bangs, twintails, fang, large_breasts, skin_fang`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 40 | 60.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blucher_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 40 | 31.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blucher_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 104 | 72.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blucher_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 40 | 52.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blucher_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 104 | 104.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blucher_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/blucher_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | smile, 1girl, looking_at_viewer, solo, open_mouth, blush, black_gloves, red_scarf, red_skirt, black_thighhighs, fingerless_gloves, white_background, hair_between_eyes, plaid_skirt, simple_background, pleated_skirt |
| 1 | 7 |  |  |  |  |  | 1girl, bodysuit, goggles_on_head, looking_at_viewer, smile, solo, ass, fake_tail, long_sleeves, official_alternate_costume, rabbit_tail, sideboob, cropped_jacket, open_mouth, white_jacket, bandaid_on_face, blush, from_behind, blue_sky, day, full_body, medium_breasts, outdoors, shoes, snow, white_gloves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | smile | 1girl | looking_at_viewer | solo | open_mouth | blush | black_gloves | red_scarf | red_skirt | black_thighhighs | fingerless_gloves | white_background | hair_between_eyes | plaid_skirt | simple_background | pleated_skirt | bodysuit | goggles_on_head | ass | fake_tail | long_sleeves | official_alternate_costume | rabbit_tail | sideboob | cropped_jacket | white_jacket | bandaid_on_face | from_behind | blue_sky | day | full_body | medium_breasts | outdoors | shoes | snow | white_gloves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:-------------|:--------|:---------------|:------------|:------------|:-------------------|:--------------------|:-------------------|:--------------------|:--------------|:--------------------|:----------------|:-----------|:------------------|:------|:------------|:---------------|:-----------------------------|:--------------|:-----------|:-----------------|:---------------|:------------------|:--------------|:-----------|:------|:------------|:-----------------|:-----------|:--------|:-------|:---------------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/blucher_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T18:43:00+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T18:52:07+00:00 |
cdee76c11f4d6feb0379bda51c287820b8741bb6 |
# Dataset of oklahoma/オクラホマ/俄克拉荷马 (Azur Lane)
This is the dataset of oklahoma/オクラホマ/俄克拉荷马 (Azur Lane), containing 28 images and their tags.
The core tags of this character are `ahoge, blue_eyes, breasts, hair_between_eyes, blonde_hair, short_hair, bangs, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 28 | 34.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oklahoma_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 28 | 18.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oklahoma_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 65 | 39.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oklahoma_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 28 | 30.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oklahoma_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 65 | 61.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oklahoma_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/oklahoma_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, solo, detached_sleeves, open_mouth, hat, simple_background, :d, boots, brown_gloves, white_background, brown_skirt, cleavage_cutout, long_sleeves, medium_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | detached_sleeves | open_mouth | hat | simple_background | :d | boots | brown_gloves | white_background | brown_skirt | cleavage_cutout | long_sleeves | medium_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-------------------|:-------------|:------|:--------------------|:-----|:--------|:---------------|:-------------------|:--------------|:------------------|:---------------|:-----------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/oklahoma_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T18:43:02+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T18:51:35+00:00 |
148a663f07bffb3e1d963185be89795740a3209d |
# Dataset of mccall/マッコール/麦考尔 (Azur Lane)
This is the dataset of mccall/マッコール/麦考尔 (Azur Lane), containing 12 images and their tags.
The core tags of this character are `blue_eyes, hair_ornament, long_hair, ahoge, pink_hair, star_hair_ornament, twintails, low_twintails, hairclip, bangs, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 8.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mccall_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 6.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mccall_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 25 | 12.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mccall_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 7.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mccall_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 25 | 15.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mccall_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mccall_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, blush, star_(symbol), popsicle, looking_at_viewer, solo, holding, shoes, short_sleeves, white_background, dress, sailor_collar |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | star_(symbol) | popsicle | looking_at_viewer | solo | holding | shoes | short_sleeves | white_background | dress | sailor_collar |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:----------------|:-----------|:--------------------|:-------|:----------|:--------|:----------------|:-------------------|:--------|:----------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/mccall_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T18:43:03+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T18:46:48+00:00 |
b20a0a097e3c9e213353cf3789d9b319c5a16049 |
# Dataset of kiev/キエフ/基辅 (Azur Lane)
This is the dataset of kiev/キエフ/基辅 (Azur Lane), containing 69 images and their tags.
The core tags of this character are `long_hair, red_eyes, twintails, breasts, hair_bun, hair_over_one_eye, cone_hair_bun, white_hair, very_long_hair, small_breasts, hat, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 69 | 124.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiev_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 69 | 58.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiev_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 177 | 135.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiev_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 69 | 105.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiev_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 177 | 214.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiev_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kiev_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, bare_shoulders, solo, white_dress, looking_at_viewer, fur-trimmed_dress, very_long_sleeves, one_eye_covered, pom_pom_hair_ornament, criss-cross_halter, cleavage, fur_hat, thighhighs, white_headwear, standing, sleeves_past_wrists |
| 1 | 23 |  |  |  |  |  | looking_at_viewer, 1girl, official_alternate_costume, bare_shoulders, solo, one_eye_covered, dress, elbow_gloves, black_gloves, blush, navel_cutout, ribbon, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | solo | white_dress | looking_at_viewer | fur-trimmed_dress | very_long_sleeves | one_eye_covered | pom_pom_hair_ornament | criss-cross_halter | cleavage | fur_hat | thighhighs | white_headwear | standing | sleeves_past_wrists | official_alternate_costume | dress | elbow_gloves | black_gloves | blush | navel_cutout | ribbon | simple_background | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:--------------|:--------------------|:--------------------|:--------------------|:------------------|:------------------------|:---------------------|:-----------|:----------|:-------------|:-----------------|:-----------|:----------------------|:-----------------------------|:--------|:---------------|:---------------|:--------|:---------------|:---------|:--------------------|:-------------------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 23 |  |  |  |  |  | X | X | X | | X | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X |
| CyberHarem/kiev_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T18:43:09+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T19:03:09+00:00 |
4c0b735c958960fdf24145d59dda705a1b6d40b5 | zhusdika/phone_calls | [
"region:us"
] | 2024-01-13T18:43:39+00:00 | {"dataset_info": {"features": [{"name": "audio", "dtype": "audio"}, {"name": "transcription", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 20598597.0, "num_examples": 2}, {"name": "test", "num_bytes": 4921255.0, "num_examples": 1}], "download_size": 22337287, "dataset_size": 25519852.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-19T09:13:23+00:00 |
|
80a29c70125fa171275be8d74eddfccb6ad657f3 | Carlosgg14/trunksdofuturo | [
"license:openrail",
"region:us"
] | 2024-01-13T18:50:05+00:00 | {"license": "openrail"} | 2024-01-13T19:15:21+00:00 |
|
29e0ad7a719f3baa7116b379734fd1f5cfd6f8a7 |
This is a filtered version of Philip May's German paraphrase dataset.
The dataset has been filtered for convenience, since smaller devices cannot handle files as large as the original.
All text pairs in the dataset are paraphrases, and are therefore labelled 1. As such, the dataset is well-suited for use in conjunction with the multiple negatives ranking loss.
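As a rough illustration, a minimal [sentence-transformers](https://www.sbert.net) training sketch with that loss might look as follows (the column names `de` and `en_de`, the base model and the batch size are assumptions carried over from the upstream Telekom dataset rather than taken from this card):
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses

# Load the filtered paraphrase pairs (column names are an assumption, see above).
pairs = load_dataset("danielheinz/telekom-backtrans-paraphrase-filtered", split="train")

# Every pair is a positive paraphrase, so no explicit label is needed:
# MultipleNegativesRankingLoss uses the other pairs in a batch as in-batch negatives.
examples = [InputExample(texts=[row["de"], row["en_de"]]) for row in pairs]

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")
loader = DataLoader(examples, shuffle=True, batch_size=32)
loss = losses.MultipleNegativesRankingLoss(model)

model.fit(train_objectives=[(loader, loss)], epochs=1, warmup_steps=100)
```
Larger batch sizes give this loss more in-batch negatives and usually help, memory permitting.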
The dataset has been filtered mostly following the guidelines suggested by the original author. Any row matching one of the following conditions was removed (a sketch of the filtering logic follows the list):
- min_char_len < 25
- de_token_count > 30
- en_de_token_count > 30
- jaccard_similarity > 0.3
- cos_sim < 0.9
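A minimal sketch of that filtering with the `datasets` library (the upstream repo id and the metadata column names are assumptions based on the original dataset's description):
```python
from datasets import load_dataset

# Unfiltered upstream dataset (repo id is an assumption).
raw = load_dataset("deutsche-telekom/ger-backtrans-paraphrase", split="train")

def keep(row):
    # Drop any row matching one of the exclusion criteria listed above.
    return not (
        row["min_char_len"] < 25
        or row["de_token_count"] > 30
        or row["en_de_token_count"] > 30
        or row["jaccard_similarity"] > 0.3
        or row["cos_sim"] < 0.9
    )

filtered = raw.filter(keep)
```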
## Licensing
Copyright (c) 2022 [Philip May](https://may.la/), [Deutsche Telekom AG](https://www.telekom.com/).
This work is licensed under [CC-BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/). | danielheinz/telekom-backtrans-paraphrase-filtered | [
"task_categories:feature-extraction",
"task_categories:text-classification",
"size_categories:100K<n<1M",
"language:de",
"license:cc-by-sa-4.0",
"region:us"
] | 2024-01-13T18:51:54+00:00 | {"language": ["de"], "license": "cc-by-sa-4.0", "size_categories": ["100K<n<1M"], "task_categories": ["feature-extraction", "text-classification"]} | 2024-01-13T22:01:06+00:00 |
a6594457a6546e8ae0a2757fff970bd2ac977f2a |
# Dataset Card for Evaluation run of Unbabel/TowerBase-7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Unbabel/TowerBase-7B-v0.1](https://huggingface.co/Unbabel/TowerBase-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Unbabel__TowerBase-7B-v0.1",
"harness_winogrande_5",
split="train")
```
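The aggregated results mentioned above can be loaded in the same way; a minimal sketch (the "results" config and "latest" split names follow the usual layout of these evaluation datasets):
```python
from datasets import load_dataset
results = load_dataset("open-llm-leaderboard/details_Unbabel__TowerBase-7B-v0.1",
	"results",
	split="latest")
```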
## Latest results
These are the [latest results from run 2024-01-13T18:50:27.460863](https://huggingface.co/datasets/open-llm-leaderboard/details_Unbabel__TowerBase-7B-v0.1/blob/main/results_2024-01-13T18-50-27.460863.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4376530652603909,
"acc_stderr": 0.03443246082169724,
"acc_norm": 0.4418549088967034,
"acc_norm_stderr": 0.035215274463337505,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041843,
"mc2": 0.3729251403943506,
"mc2_stderr": 0.013360710032144346
},
"harness|arc:challenge|25": {
"acc": 0.48464163822525597,
"acc_stderr": 0.014604496129394906,
"acc_norm": 0.5102389078498294,
"acc_norm_stderr": 0.014608326906285012
},
"harness|hellaswag|10": {
"acc": 0.5780720971917944,
"acc_stderr": 0.004928578106026371,
"acc_norm": 0.7768372834096794,
"acc_norm_stderr": 0.0041551563175093375
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.04256193767901407,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.04256193767901407
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4226415094339623,
"acc_stderr": 0.030402331445769537,
"acc_norm": 0.4226415094339623,
"acc_norm_stderr": 0.030402331445769537
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37872340425531914,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.37872340425531914,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.04122737111370331,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.04122737111370331
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.02313528797432563,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.02313528797432563
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.038095238095238106,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.038095238095238106
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.432258064516129,
"acc_stderr": 0.02818173972001941,
"acc_norm": 0.432258064516129,
"acc_norm_stderr": 0.02818173972001941
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.503030303030303,
"acc_stderr": 0.03904272341431856,
"acc_norm": 0.503030303030303,
"acc_norm_stderr": 0.03904272341431856
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4595959595959596,
"acc_stderr": 0.035507024651313425,
"acc_norm": 0.4595959595959596,
"acc_norm_stderr": 0.035507024651313425
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6113989637305699,
"acc_stderr": 0.03517739796373131,
"acc_norm": 0.6113989637305699,
"acc_norm_stderr": 0.03517739796373131
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.43333333333333335,
"acc_stderr": 0.025124653525885117,
"acc_norm": 0.43333333333333335,
"acc_norm_stderr": 0.025124653525885117
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3277310924369748,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.3277310924369748,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.037804458505267334,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.037804458505267334
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6110091743119266,
"acc_stderr": 0.020902300887392873,
"acc_norm": 0.6110091743119266,
"acc_norm_stderr": 0.020902300887392873
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03114144782353603,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03114144782353603
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.03506612560524866,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.03506612560524866
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5485232067510548,
"acc_stderr": 0.0323936001739747,
"acc_norm": 0.5485232067510548,
"acc_norm_stderr": 0.0323936001739747
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.48878923766816146,
"acc_stderr": 0.033549366530984746,
"acc_norm": 0.48878923766816146,
"acc_norm_stderr": 0.033549366530984746
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5114503816793893,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.5114503816793893,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068383,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068383
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4723926380368098,
"acc_stderr": 0.039223782906109894,
"acc_norm": 0.4723926380368098,
"acc_norm_stderr": 0.039223782906109894
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.49514563106796117,
"acc_stderr": 0.049505043821289195,
"acc_norm": 0.49514563106796117,
"acc_norm_stderr": 0.049505043821289195
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6452991452991453,
"acc_stderr": 0.03134250486245402,
"acc_norm": 0.6452991452991453,
"acc_norm_stderr": 0.03134250486245402
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5900383141762452,
"acc_stderr": 0.017587672312336045,
"acc_norm": 0.5900383141762452,
"acc_norm_stderr": 0.017587672312336045
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.47109826589595377,
"acc_stderr": 0.02687408588351835,
"acc_norm": 0.47109826589595377,
"acc_norm_stderr": 0.02687408588351835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5228758169934641,
"acc_stderr": 0.028599936776089768,
"acc_norm": 0.5228758169934641,
"acc_norm_stderr": 0.028599936776089768
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5659163987138264,
"acc_stderr": 0.0281502322445356,
"acc_norm": 0.5659163987138264,
"acc_norm_stderr": 0.0281502322445356
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.02774431344337654,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.02774431344337654
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3475177304964539,
"acc_stderr": 0.028406627809590954,
"acc_norm": 0.3475177304964539,
"acc_norm_stderr": 0.028406627809590954
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3272490221642764,
"acc_stderr": 0.011983819806464752,
"acc_norm": 0.3272490221642764,
"acc_norm_stderr": 0.011983819806464752
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.41013071895424835,
"acc_stderr": 0.019898412717635903,
"acc_norm": 0.41013071895424835,
"acc_norm_stderr": 0.019898412717635903
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.04769300568972744,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.04769300568972744
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4897959183673469,
"acc_stderr": 0.03200255347893782,
"acc_norm": 0.4897959183673469,
"acc_norm_stderr": 0.03200255347893782
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3674698795180723,
"acc_stderr": 0.03753267402120575,
"acc_norm": 0.3674698795180723,
"acc_norm_stderr": 0.03753267402120575
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041843,
"mc2": 0.3729251403943506,
"mc2_stderr": 0.013360710032144346
},
"harness|winogrande|5": {
"acc": 0.7205998421468035,
"acc_stderr": 0.012610826539404676
},
"harness|gsm8k|5": {
"acc": 0.13115996967399546,
"acc_stderr": 0.009298499235587877
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Unbabel__TowerBase-7B-v0.1 | [
"region:us"
] | 2024-01-13T18:52:49+00:00 | {"pretty_name": "Evaluation run of Unbabel/TowerBase-7B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [Unbabel/TowerBase-7B-v0.1](https://huggingface.co/Unbabel/TowerBase-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Unbabel__TowerBase-7B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T18:50:27.460863](https://huggingface.co/datasets/open-llm-leaderboard/details_Unbabel__TowerBase-7B-v0.1/blob/main/results_2024-01-13T18-50-27.460863.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4376530652603909,\n \"acc_stderr\": 0.03443246082169724,\n \"acc_norm\": 0.4418549088967034,\n \"acc_norm_stderr\": 0.035215274463337505,\n \"mc1\": 0.23745410036719705,\n \"mc1_stderr\": 0.014896277441041843,\n \"mc2\": 0.3729251403943506,\n \"mc2_stderr\": 0.013360710032144346\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.48464163822525597,\n \"acc_stderr\": 0.014604496129394906,\n \"acc_norm\": 0.5102389078498294,\n \"acc_norm_stderr\": 0.014608326906285012\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5780720971917944,\n \"acc_stderr\": 0.004928578106026371,\n \"acc_norm\": 0.7768372834096794,\n \"acc_norm_stderr\": 0.0041551563175093375\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n \"acc_stderr\": 0.04256193767901407,\n \"acc_norm\": 0.4148148148148148,\n \"acc_norm_stderr\": 0.04256193767901407\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.039777499346220734,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.039777499346220734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4226415094339623,\n \"acc_stderr\": 0.030402331445769537,\n \"acc_norm\": 0.4226415094339623,\n \"acc_norm_stderr\": 0.030402331445769537\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 
0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.4277456647398844,\n \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.37872340425531914,\n \"acc_stderr\": 0.03170995606040655,\n \"acc_norm\": 0.37872340425531914,\n \"acc_norm_stderr\": 0.03170995606040655\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370331,\n \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370331\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2804232804232804,\n \"acc_stderr\": 0.02313528797432563,\n \"acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.02313528797432563\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.038095238095238106,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.038095238095238106\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.432258064516129,\n \"acc_stderr\": 0.02818173972001941,\n \"acc_norm\": 0.432258064516129,\n \"acc_norm_stderr\": 0.02818173972001941\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.503030303030303,\n \"acc_stderr\": 0.03904272341431856,\n \"acc_norm\": 0.503030303030303,\n \"acc_norm_stderr\": 0.03904272341431856\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4595959595959596,\n \"acc_stderr\": 0.035507024651313425,\n \"acc_norm\": 0.4595959595959596,\n \"acc_norm_stderr\": 0.035507024651313425\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6113989637305699,\n \"acc_stderr\": 0.03517739796373131,\n \"acc_norm\": 0.6113989637305699,\n \"acc_norm_stderr\": 0.03517739796373131\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.43333333333333335,\n \"acc_stderr\": 0.025124653525885117,\n \"acc_norm\": 0.43333333333333335,\n \"acc_norm_stderr\": 0.025124653525885117\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3277310924369748,\n \"acc_stderr\": 0.030489911417673227,\n \"acc_norm\": 0.3277310924369748,\n \"acc_norm_stderr\": 0.030489911417673227\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.037804458505267334,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.037804458505267334\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6110091743119266,\n \"acc_stderr\": 0.020902300887392873,\n \"acc_norm\": 0.6110091743119266,\n \"acc_norm_stderr\": 0.020902300887392873\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.03114144782353603,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.03114144782353603\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.03506612560524866,\n \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.03506612560524866\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5485232067510548,\n \"acc_stderr\": 0.0323936001739747,\n \"acc_norm\": 0.5485232067510548,\n \"acc_norm_stderr\": 0.0323936001739747\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.48878923766816146,\n \"acc_stderr\": 0.033549366530984746,\n \"acc_norm\": 0.48878923766816146,\n \"acc_norm_stderr\": 0.033549366530984746\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.043841400240780176,\n \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.043841400240780176\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4723926380368098,\n \"acc_stderr\": 0.039223782906109894,\n \"acc_norm\": 0.4723926380368098,\n \"acc_norm_stderr\": 0.039223782906109894\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.49514563106796117,\n \"acc_stderr\": 0.049505043821289195,\n \"acc_norm\": 0.49514563106796117,\n \"acc_norm_stderr\": 0.049505043821289195\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6452991452991453,\n \"acc_stderr\": 0.03134250486245402,\n \"acc_norm\": 0.6452991452991453,\n \"acc_norm_stderr\": 0.03134250486245402\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5900383141762452,\n 
\"acc_stderr\": 0.017587672312336045,\n \"acc_norm\": 0.5900383141762452,\n \"acc_norm_stderr\": 0.017587672312336045\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.47109826589595377,\n \"acc_stderr\": 0.02687408588351835,\n \"acc_norm\": 0.47109826589595377,\n \"acc_norm_stderr\": 0.02687408588351835\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5228758169934641,\n \"acc_stderr\": 0.028599936776089768,\n \"acc_norm\": 0.5228758169934641,\n \"acc_norm_stderr\": 0.028599936776089768\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5659163987138264,\n \"acc_stderr\": 0.0281502322445356,\n \"acc_norm\": 0.5659163987138264,\n \"acc_norm_stderr\": 0.0281502322445356\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.02774431344337654,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.02774431344337654\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3475177304964539,\n \"acc_stderr\": 0.028406627809590954,\n \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.028406627809590954\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3272490221642764,\n \"acc_stderr\": 0.011983819806464752,\n \"acc_norm\": 0.3272490221642764,\n \"acc_norm_stderr\": 0.011983819806464752\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904612,\n \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904612\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.41013071895424835,\n \"acc_stderr\": 0.019898412717635903,\n \"acc_norm\": 0.41013071895424835,\n \"acc_norm_stderr\": 0.019898412717635903\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.45454545454545453,\n \"acc_stderr\": 0.04769300568972744,\n \"acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.04769300568972744\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4897959183673469,\n \"acc_stderr\": 0.03200255347893782,\n \"acc_norm\": 0.4897959183673469,\n \"acc_norm_stderr\": 0.03200255347893782\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3674698795180723,\n \"acc_stderr\": 0.03753267402120575,\n \"acc_norm\": 0.3674698795180723,\n \"acc_norm_stderr\": 0.03753267402120575\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708311,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708311\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n \"mc1_stderr\": 0.014896277441041843,\n \"mc2\": 0.3729251403943506,\n \"mc2_stderr\": 0.013360710032144346\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7205998421468035,\n \"acc_stderr\": 0.012610826539404676\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13115996967399546,\n \"acc_stderr\": 0.009298499235587877\n }\n}\n```", 
"repo_url": "https://huggingface.co/Unbabel/TowerBase-7B-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|arc:challenge|25_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|gsm8k|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hellaswag|10_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-50-27.460863.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-50-27.460863.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-50-27.460863.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T18-50-27.460863.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-50-27.460863.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T18_50_27.460863", "path": ["**/details_harness|winogrande|5_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T18-50-27.460863.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T18_50_27.460863", "path": ["results_2024-01-13T18-50-27.460863.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T18-50-27.460863.parquet"]}]}]} | 2024-01-13T18:53:09+00:00 |
b47679c2c04734ea5423eec85c3583834173b4cd |
# Dataset Card for Evaluation run of aihub-app/zyte-1B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [aihub-app/zyte-1B](https://huggingface.co/aihub-app/zyte-1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aihub-app__zyte-1B",
"harness_winogrande_5",
split="train")
```
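If you only need the aggregated metrics rather than the per-sample details, the `results` configuration can be loaded the same way. The sketch below is a minimal example, assuming the `results` config exposes a `latest` split (as these leaderboard detail datasets conventionally do); `to_pandas()` is used purely for convenient inspection:

```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_aihub-app__zyte-1B",
                       "results",
                       split="latest")

# Convert to a pandas DataFrame to browse the reported columns conveniently.
df = results.to_pandas()
print(df.head())
```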
## Latest results
These are the [latest results from run 2024-01-13T18:52:20.951527](https://huggingface.co/datasets/open-llm-leaderboard/details_aihub-app__zyte-1B/blob/main/results_2024-01-13T18-52-20.951527.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2535595202605755,
"acc_stderr": 0.030560550793178157,
"acc_norm": 0.2546044559605402,
"acc_norm_stderr": 0.031312162600645795,
"mc1": 0.2717258261933905,
"mc1_stderr": 0.015572840452875828,
"mc2": 0.4214033627668609,
"mc2_stderr": 0.01468478270821933
},
"harness|arc:challenge|25": {
"acc": 0.34726962457337884,
"acc_stderr": 0.013913034529620434,
"acc_norm": 0.378839590443686,
"acc_norm_stderr": 0.014175915490000324
},
"harness|hellaswag|10": {
"acc": 0.4567815176259709,
"acc_stderr": 0.004971106265046556,
"acc_norm": 0.6137223660625374,
"acc_norm_stderr": 0.0048590041846946225
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.025288394502891366,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.025288394502891366
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.1791907514450867,
"acc_stderr": 0.02924251305906328,
"acc_norm": 0.1791907514450867,
"acc_norm_stderr": 0.02924251305906328
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2723404255319149,
"acc_stderr": 0.0291012906983867,
"acc_norm": 0.2723404255319149,
"acc_norm_stderr": 0.0291012906983867
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.15789473684210525,
"acc_stderr": 0.034302659784856984,
"acc_norm": 0.15789473684210525,
"acc_norm_stderr": 0.034302659784856984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727772,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727772
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.021935878081184756,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.021935878081184756
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333338,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333338
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1967741935483871,
"acc_stderr": 0.022616409420742018,
"acc_norm": 0.1967741935483871,
"acc_norm_stderr": 0.022616409420742018
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.028501378167893946,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.028501378167893946
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.029857515673386407,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.029857515673386407
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20725388601036268,
"acc_stderr": 0.029252823291803624,
"acc_norm": 0.20725388601036268,
"acc_norm_stderr": 0.029252823291803624
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2512820512820513,
"acc_stderr": 0.021992016662370547,
"acc_norm": 0.2512820512820513,
"acc_norm_stderr": 0.021992016662370547
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21851851851851853,
"acc_stderr": 0.025195752251823796,
"acc_norm": 0.21851851851851853,
"acc_norm_stderr": 0.025195752251823796
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868952,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868952
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24036697247706423,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.24036697247706423,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.33796296296296297,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.33796296296296297,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.029331162294251728,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.029331162294251728
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598028,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598028
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3273542600896861,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.3273542600896861,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.027046857630716677,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.027046857630716677
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.016328814422102055,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.016328814422102055
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.023445826276545526,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.023445826276545526
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26927374301675977,
"acc_stderr": 0.014835616582882578,
"acc_norm": 0.26927374301675977,
"acc_norm_stderr": 0.014835616582882578
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.02463004897982476,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.02463004897982476
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2604501607717042,
"acc_stderr": 0.02492672322484554,
"acc_norm": 0.2604501607717042,
"acc_norm_stderr": 0.02492672322484554
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25617283950617287,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.25617283950617287,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24468085106382978,
"acc_stderr": 0.025645553622266733,
"acc_norm": 0.24468085106382978,
"acc_norm_stderr": 0.025645553622266733
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.24632352941176472,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.24632352941176472,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.017848089574913222,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.017848089574913222
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782834,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17142857142857143,
"acc_stderr": 0.024127463462650135,
"acc_norm": 0.17142857142857143,
"acc_norm_stderr": 0.024127463462650135
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573037,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573037
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.036643147772880864,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.036643147772880864
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2717258261933905,
"mc1_stderr": 0.015572840452875828,
"mc2": 0.4214033627668609,
"mc2_stderr": 0.01468478270821933
},
"harness|winogrande|5": {
"acc": 0.6195737963693765,
"acc_stderr": 0.013644727908656831
},
"harness|gsm8k|5": {
"acc": 0.014404852160727824,
"acc_stderr": 0.0032820559171369795
}
}
```
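Because the block above is a plain JSON dictionary, the per-task scores can also be inspected without the `datasets` library. The following sketch assumes you have saved that dictionary to a local JSON file; the filename is illustrative, not a file shipped with this card:

```python
import json

# Load a local copy of the results dictionary shown above (filename is illustrative).
with open("zyte-1B_latest_results.json") as f:
    scores = json.load(f)

# Collect per-task accuracies, skipping the "all" aggregate and entries without an "acc" key
# (e.g. harness|truthfulqa:mc|0 only reports mc1/mc2).
per_task = {
    task: vals["acc"]
    for task, vals in scores.items()
    if task != "all" and "acc" in vals
}

# Print tasks from strongest to weakest accuracy.
for task, acc in sorted(per_task.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{acc:.3f}  {task}")
```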
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_aihub-app__zyte-1B | [
"region:us"
] | 2024-01-13T18:54:15+00:00 | {"pretty_name": "Evaluation run of aihub-app/zyte-1B", "dataset_summary": "Dataset automatically created during the evaluation run of model [aihub-app/zyte-1B](https://huggingface.co/aihub-app/zyte-1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aihub-app__zyte-1B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T18:52:20.951527](https://huggingface.co/datasets/open-llm-leaderboard/details_aihub-app__zyte-1B/blob/main/results_2024-01-13T18-52-20.951527.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2535595202605755,\n \"acc_stderr\": 0.030560550793178157,\n \"acc_norm\": 0.2546044559605402,\n \"acc_norm_stderr\": 0.031312162600645795,\n \"mc1\": 0.2717258261933905,\n \"mc1_stderr\": 0.015572840452875828,\n \"mc2\": 0.4214033627668609,\n \"mc2_stderr\": 0.01468478270821933\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.34726962457337884,\n \"acc_stderr\": 0.013913034529620434,\n \"acc_norm\": 0.378839590443686,\n \"acc_norm_stderr\": 0.014175915490000324\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4567815176259709,\n \"acc_stderr\": 0.004971106265046556,\n \"acc_norm\": 0.6137223660625374,\n \"acc_norm_stderr\": 0.0048590041846946225\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.025288394502891366,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.025288394502891366\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n 
\"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.1791907514450867,\n \"acc_stderr\": 0.02924251305906328,\n \"acc_norm\": 0.1791907514450867,\n \"acc_norm_stderr\": 0.02924251305906328\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2723404255319149,\n \"acc_stderr\": 0.0291012906983867,\n \"acc_norm\": 0.2723404255319149,\n \"acc_norm_stderr\": 0.0291012906983867\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.15789473684210525,\n \"acc_stderr\": 0.034302659784856984,\n \"acc_norm\": 0.15789473684210525,\n \"acc_norm_stderr\": 0.034302659784856984\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727772,\n \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727772\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184756,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184756\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03333333333333338,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03333333333333338\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366255,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1967741935483871,\n \"acc_stderr\": 0.022616409420742018,\n \"acc_norm\": 0.1967741935483871,\n \"acc_norm_stderr\": 0.022616409420742018\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.028501378167893946,\n \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.028501378167893946\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.029857515673386407,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.029857515673386407\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.20725388601036268,\n \"acc_stderr\": 0.029252823291803624,\n \"acc_norm\": 0.20725388601036268,\n \"acc_norm_stderr\": 0.029252823291803624\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.2512820512820513,\n \"acc_stderr\": 0.021992016662370547,\n \"acc_norm\": 0.2512820512820513,\n \"acc_norm_stderr\": 0.021992016662370547\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.21851851851851853,\n \"acc_stderr\": 0.025195752251823796,\n \"acc_norm\": 0.21851851851851853,\n \"acc_norm_stderr\": 0.025195752251823796\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868952,\n \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868952\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24036697247706423,\n \"acc_stderr\": 0.01832060732096407,\n \"acc_norm\": 0.24036697247706423,\n \"acc_norm_stderr\": 0.01832060732096407\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.33796296296296297,\n \"acc_stderr\": 0.03225941352631295,\n \"acc_norm\": 0.33796296296296297,\n \"acc_norm_stderr\": 0.03225941352631295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.029331162294251728,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.029331162294251728\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598028,\n \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598028\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3273542600896861,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.3273542600896861,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\": 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.027046857630716677,\n \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.027046857630716677\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 
0.016328814422102055,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.016328814422102055\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.023445826276545526,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.023445826276545526\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26927374301675977,\n \"acc_stderr\": 0.014835616582882578,\n \"acc_norm\": 0.26927374301675977,\n \"acc_norm_stderr\": 0.014835616582882578\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.02463004897982476,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.02463004897982476\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2604501607717042,\n \"acc_stderr\": 0.02492672322484554,\n \"acc_norm\": 0.2604501607717042,\n \"acc_norm_stderr\": 0.02492672322484554\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25617283950617287,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.25617283950617287,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24468085106382978,\n \"acc_stderr\": 0.025645553622266733,\n \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.025645553622266733\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.24632352941176472,\n \"acc_stderr\": 0.02617343857052,\n \"acc_norm\": 0.24632352941176472,\n \"acc_norm_stderr\": 0.02617343857052\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.017848089574913222,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.017848089574913222\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n \"acc_stderr\": 0.041220665028782834,\n \"acc_norm\": 0.24545454545454545,\n \"acc_norm_stderr\": 0.041220665028782834\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17142857142857143,\n \"acc_stderr\": 0.024127463462650135,\n \"acc_norm\": 0.17142857142857143,\n \"acc_norm_stderr\": 0.024127463462650135\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n \"acc_stderr\": 0.030965903123573037,\n \"acc_norm\": 0.25870646766169153,\n \"acc_norm_stderr\": 0.030965903123573037\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n \"acc_stderr\": 0.036643147772880864,\n \"acc_norm\": 0.3313253012048193,\n \"acc_norm_stderr\": 0.036643147772880864\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.031267817146631786,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.031267817146631786\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2717258261933905,\n \"mc1_stderr\": 0.015572840452875828,\n \"mc2\": 0.4214033627668609,\n \"mc2_stderr\": 0.01468478270821933\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6195737963693765,\n \"acc_stderr\": 0.013644727908656831\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.014404852160727824,\n \"acc_stderr\": 0.0032820559171369795\n }\n}\n```", 
"repo_url": "https://huggingface.co/aihub-app/zyte-1B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|arc:challenge|25_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|gsm8k|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hellaswag|10_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-52-20.951527.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-52-20.951527.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-52-20.951527.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T18-52-20.951527.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-52-20.951527.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-52-20.951527.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["**/details_harness|winogrande|5_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T18-52-20.951527.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T18_52_20.951527", "path": ["results_2024-01-13T18-52-20.951527.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T18-52-20.951527.parquet"]}]}]} | 2024-01-13T18:54:37+00:00 |
b54fd84b7d613994a9d7383237910c2b95204c3b |
# Dataset Card for Evaluation run of rombodawg/Open_Gpt4_8x7B_v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rombodawg/Open_Gpt4_8x7B_v0.2](https://huggingface.co/rombodawg/Open_Gpt4_8x7B_v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rombodawg__Open_Gpt4_8x7B_v0.2",
"harness_winogrande_5",
split="train")
```
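Each configuration also exposes a "latest" split (listed in this card's metadata), so you can point at the most recent evaluation without knowing its timestamp. A minimal sketch, assuming the task configuration name stays the same across runs:

```python
from datasets import load_dataset

# Load the most recent results for one task configuration.
# The "latest" split is an alias for the newest timestamped run.
data = load_dataset("open-llm-leaderboard/details_rombodawg__Open_Gpt4_8x7B_v0.2",
                    "harness_winogrande_5",
                    split="latest")
print(data)  # per-example details for this task
```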
## Latest results
These are the [latest results from run 2024-01-13T18:56:10.033721](https://huggingface.co/datasets/open-llm-leaderboard/details_rombodawg__Open_Gpt4_8x7B_v0.2/blob/main/results_2024-01-13T18-56-10.033721.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7188157275221039,
"acc_stderr": 0.030029707306740233,
"acc_norm": 0.7225114431475408,
"acc_norm_stderr": 0.03061684137993921,
"mc1": 0.5605875152998776,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.7191590734021742,
"mc2_stderr": 0.014814881257041205
},
"harness|arc:challenge|25": {
"acc": 0.6646757679180887,
"acc_stderr": 0.01379618294778556,
"acc_norm": 0.6868600682593856,
"acc_norm_stderr": 0.013552671543623496
},
"harness|hellaswag|10": {
"acc": 0.6761601274646485,
"acc_stderr": 0.0046698341309770785,
"acc_norm": 0.8615813582951604,
"acc_norm_stderr": 0.0034463307489637123
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8223684210526315,
"acc_stderr": 0.031103182383123377,
"acc_norm": 0.8223684210526315,
"acc_norm_stderr": 0.031103182383123377
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7962264150943397,
"acc_stderr": 0.024790784501775406,
"acc_norm": 0.7962264150943397,
"acc_norm_stderr": 0.024790784501775406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8402777777777778,
"acc_stderr": 0.030635578972093288,
"acc_norm": 0.8402777777777778,
"acc_norm_stderr": 0.030635578972093288
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.04971358884367405,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.04971358884367405
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.723404255319149,
"acc_stderr": 0.02924188386962882,
"acc_norm": 0.723404255319149,
"acc_norm_stderr": 0.02924188386962882
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.04537815354939391,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.04537815354939391
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6689655172413793,
"acc_stderr": 0.03921545312467122,
"acc_norm": 0.6689655172413793,
"acc_norm_stderr": 0.03921545312467122
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.0256993528321318,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.0256993528321318
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.864516129032258,
"acc_stderr": 0.01946933458648693,
"acc_norm": 0.864516129032258,
"acc_norm_stderr": 0.01946933458648693
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6305418719211823,
"acc_stderr": 0.03395970381998574,
"acc_norm": 0.6305418719211823,
"acc_norm_stderr": 0.03395970381998574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.02406315641682253,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.02406315641682253
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240524,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240524
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7205128205128205,
"acc_stderr": 0.022752388839776823,
"acc_norm": 0.7205128205128205,
"acc_norm_stderr": 0.022752388839776823
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.02938162072646507,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.02938162072646507
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8109243697478992,
"acc_stderr": 0.025435119438105364,
"acc_norm": 0.8109243697478992,
"acc_norm_stderr": 0.025435119438105364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8990825688073395,
"acc_stderr": 0.012914673545364432,
"acc_norm": 0.8990825688073395,
"acc_norm_stderr": 0.012914673545364432
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.03344887382997865,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.03344887382997865
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8774509803921569,
"acc_stderr": 0.023015389732458265,
"acc_norm": 0.8774509803921569,
"acc_norm_stderr": 0.023015389732458265
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.021331741829746786,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.021331741829746786
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7443946188340808,
"acc_stderr": 0.029275891003969927,
"acc_norm": 0.7443946188340808,
"acc_norm_stderr": 0.029275891003969927
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.031457038543062504,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.031457038543062504
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911899,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911899
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.046161430750285455,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.046161430750285455
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867457,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867457
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8812260536398467,
"acc_stderr": 0.011569134791715655,
"acc_norm": 0.8812260536398467,
"acc_norm_stderr": 0.011569134791715655
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7976878612716763,
"acc_stderr": 0.021628077380196124,
"acc_norm": 0.7976878612716763,
"acc_norm_stderr": 0.021628077380196124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5687150837988827,
"acc_stderr": 0.01656382939904771,
"acc_norm": 0.5687150837988827,
"acc_norm_stderr": 0.01656382939904771
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340866,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340866
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7942122186495176,
"acc_stderr": 0.022961339906764244,
"acc_norm": 0.7942122186495176,
"acc_norm_stderr": 0.022961339906764244
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.019766459563597252,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.019766459563597252
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.02976667507587387,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.02976667507587387
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5436766623207301,
"acc_stderr": 0.012721420501462547,
"acc_norm": 0.5436766623207301,
"acc_norm_stderr": 0.012721420501462547
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7904411764705882,
"acc_stderr": 0.02472311040767708,
"acc_norm": 0.7904411764705882,
"acc_norm_stderr": 0.02472311040767708
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.017035229258034038,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.017035229258034038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8040816326530612,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.8040816326530612,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.02116621630465939,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.02116621630465939
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015574,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015574
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5605875152998776,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.7191590734021742,
"mc2_stderr": 0.014814881257041205
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.0104108497752228
},
"harness|gsm8k|5": {
"acc": 0.5913570887035633,
"acc_stderr": 0.013540639733342429
}
}
```
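The aggregated numbers above are also stored in the "results" configuration of this dataset (see the configs listed in the metadata), so they can be read back programmatically. A minimal sketch, assuming the parquet schema mirrors the JSON shown above:

```python
from datasets import load_dataset

# Load the aggregated results table; the "latest" split points to the
# newest results_*.parquet file for this model.
results = load_dataset("open-llm-leaderboard/details_rombodawg__Open_Gpt4_8x7B_v0.2",
                       "results",
                       split="latest")

# The exact column layout is an assumption (it should follow the nested
# task -> metric structure of the JSON above); inspect it before relying on it.
print(results.features)
print(results[0])
```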
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_rombodawg__Open_Gpt4_8x7B_v0.2 | [
"region:us"
] | 2024-01-13T18:58:32+00:00 | {"pretty_name": "Evaluation run of rombodawg/Open_Gpt4_8x7B_v0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [rombodawg/Open_Gpt4_8x7B_v0.2](https://huggingface.co/rombodawg/Open_Gpt4_8x7B_v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rombodawg__Open_Gpt4_8x7B_v0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T18:56:10.033721](https://huggingface.co/datasets/open-llm-leaderboard/details_rombodawg__Open_Gpt4_8x7B_v0.2/blob/main/results_2024-01-13T18-56-10.033721.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7188157275221039,\n \"acc_stderr\": 0.030029707306740233,\n \"acc_norm\": 0.7225114431475408,\n \"acc_norm_stderr\": 0.03061684137993921,\n \"mc1\": 0.5605875152998776,\n \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.7191590734021742,\n \"mc2_stderr\": 0.014814881257041205\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6646757679180887,\n \"acc_stderr\": 0.01379618294778556,\n \"acc_norm\": 0.6868600682593856,\n \"acc_norm_stderr\": 0.013552671543623496\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6761601274646485,\n \"acc_stderr\": 0.0046698341309770785,\n \"acc_norm\": 0.8615813582951604,\n \"acc_norm_stderr\": 0.0034463307489637123\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.031103182383123377,\n \"acc_norm\": 0.8223684210526315,\n \"acc_norm_stderr\": 0.031103182383123377\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7962264150943397,\n \"acc_stderr\": 0.024790784501775406,\n \"acc_norm\": 0.7962264150943397,\n \"acc_norm_stderr\": 0.024790784501775406\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n \"acc_stderr\": 0.030635578972093288,\n \"acc_norm\": 0.8402777777777778,\n \"acc_norm_stderr\": 0.030635578972093288\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n 
\"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.04971358884367405,\n \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.04971358884367405\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.723404255319149,\n \"acc_stderr\": 0.02924188386962882,\n \"acc_norm\": 0.723404255319149,\n \"acc_norm_stderr\": 0.02924188386962882\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6689655172413793,\n \"acc_stderr\": 0.03921545312467122,\n \"acc_norm\": 0.6689655172413793,\n \"acc_norm_stderr\": 0.03921545312467122\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5317460317460317,\n \"acc_stderr\": 0.0256993528321318,\n \"acc_norm\": 0.5317460317460317,\n \"acc_norm_stderr\": 0.0256993528321318\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.864516129032258,\n \"acc_stderr\": 0.01946933458648693,\n \"acc_norm\": 0.864516129032258,\n \"acc_norm_stderr\": 0.01946933458648693\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6305418719211823,\n \"acc_stderr\": 0.03395970381998574,\n \"acc_norm\": 0.6305418719211823,\n \"acc_norm_stderr\": 0.03395970381998574\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.02406315641682253,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.02406315641682253\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240524,\n \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240524\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7205128205128205,\n \"acc_stderr\": 0.022752388839776823,\n \"acc_norm\": 0.7205128205128205,\n \"acc_norm_stderr\": 0.022752388839776823\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02938162072646507,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02938162072646507\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8109243697478992,\n \"acc_stderr\": 0.025435119438105364,\n \"acc_norm\": 0.8109243697478992,\n \"acc_norm_stderr\": 0.025435119438105364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8990825688073395,\n \"acc_stderr\": 0.012914673545364432,\n \"acc_norm\": 0.8990825688073395,\n \"acc_norm_stderr\": 0.012914673545364432\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997865,\n \"acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997865\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8774509803921569,\n \"acc_stderr\": 0.023015389732458265,\n \"acc_norm\": 0.8774509803921569,\n \"acc_norm_stderr\": 0.023015389732458265\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8776371308016878,\n \"acc_stderr\": 0.021331741829746786,\n \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.021331741829746786\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7443946188340808,\n \"acc_stderr\": 0.029275891003969927,\n \"acc_norm\": 0.7443946188340808,\n \"acc_norm_stderr\": 0.029275891003969927\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.031457038543062504,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.031457038543062504\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911899,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911899\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.046161430750285455,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.046161430750285455\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n \"acc_stderr\": 0.019875655027867457,\n \"acc_norm\": 0.8974358974358975,\n \"acc_norm_stderr\": 0.019875655027867457\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8812260536398467,\n \"acc_stderr\": 0.011569134791715655,\n \"acc_norm\": 0.8812260536398467,\n \"acc_norm_stderr\": 0.011569134791715655\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7976878612716763,\n \"acc_stderr\": 0.021628077380196124,\n \"acc_norm\": 0.7976878612716763,\n \"acc_norm_stderr\": 0.021628077380196124\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5687150837988827,\n \"acc_stderr\": 0.01656382939904771,\n \"acc_norm\": 0.5687150837988827,\n \"acc_norm_stderr\": 0.01656382939904771\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340866,\n \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340866\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7942122186495176,\n \"acc_stderr\": 0.022961339906764244,\n \"acc_norm\": 0.7942122186495176,\n \"acc_norm_stderr\": 0.022961339906764244\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.019766459563597252,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.019766459563597252\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.02976667507587387,\n \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.02976667507587387\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5436766623207301,\n \"acc_stderr\": 0.012721420501462547,\n \"acc_norm\": 0.5436766623207301,\n \"acc_norm_stderr\": 0.012721420501462547\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7904411764705882,\n \"acc_stderr\": 0.02472311040767708,\n \"acc_norm\": 0.7904411764705882,\n \"acc_norm_stderr\": 0.02472311040767708\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7696078431372549,\n \"acc_stderr\": 0.017035229258034038,\n \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.017035229258034038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8040816326530612,\n \"acc_stderr\": 0.025409301953225678,\n \"acc_norm\": 0.8040816326530612,\n \"acc_norm_stderr\": 0.025409301953225678\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.02116621630465939,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.02116621630465939\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015574,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015574\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5605875152998776,\n \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.7191590734021742,\n \"mc2_stderr\": 0.014814881257041205\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.0104108497752228\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5913570887035633,\n \"acc_stderr\": 
0.013540639733342429\n }\n}\n```", "repo_url": "https://huggingface.co/rombodawg/Open_Gpt4_8x7B_v0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|arc:challenge|25_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|gsm8k|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hellaswag|10_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-56-10.033721.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-56-10.033721.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-56-10.033721.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T18-56-10.033721.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-56-10.033721.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T18_56_10.033721", "path": ["**/details_harness|winogrande|5_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T18-56-10.033721.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T18_56_10.033721", "path": ["results_2024-01-13T18-56-10.033721.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T18-56-10.033721.parquet"]}]}]} | 2024-01-13T18:58:52+00:00 |
ac604b735bafa8f3c1b7ab602daf7de7fc2fb7ee |
# Dataset Card for Evaluation run of udkai/Turdus
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [udkai/Turdus](https://huggingface.co/udkai/Turdus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_udkai__Turdus",
"harness_winogrande_5",
split="train")
```
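
The aggregated metrics can be pulled the same way; the following is a minimal sketch, assuming the split layout listed in the repository metadata (one timestamped split per run plus a `latest` alias):

```python
from datasets import load_dataset

# Aggregated metrics only (the "results" configuration described above).
# Per-run splits are named by timestamp, with "latest" pointing at the most recent run.
agg = load_dataset("open-llm-leaderboard/details_udkai__Turdus",
	"results",
	split="latest")
print(agg[0])  # one row containing the aggregated scores for the latest run
```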
## Latest results
These are the [latest results from run 2024-01-13T18:57:41.292260](https://huggingface.co/datasets/open-llm-leaderboard/details_udkai__Turdus/blob/main/results_2024-01-13T18-57-41.292260.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6516181926648874,
"acc_stderr": 0.0321728872347043,
"acc_norm": 0.650729026842337,
"acc_norm_stderr": 0.03285551278950848,
"mc1": 0.5471236230110159,
"mc1_stderr": 0.01742558984831402,
"mc2": 0.6711400532546105,
"mc2_stderr": 0.015451181249566945
},
"harness|arc:challenge|25": {
"acc": 0.7107508532423208,
"acc_stderr": 0.013250012579393443,
"acc_norm": 0.7337883959044369,
"acc_norm_stderr": 0.012915774781523197
},
"harness|hellaswag|10": {
"acc": 0.7206731726747659,
"acc_stderr": 0.004477514681328155,
"acc_norm": 0.8855805616411073,
"acc_norm_stderr": 0.0031766945645110784
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933712,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933712
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033484,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033484
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4491620111731844,
"acc_stderr": 0.016635838341631928,
"acc_norm": 0.4491620111731844,
"acc_norm_stderr": 0.016635838341631928
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303957,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303957
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507205,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507205
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5471236230110159,
"mc1_stderr": 0.01742558984831402,
"mc2": 0.6711400532546105,
"mc2_stderr": 0.015451181249566945
},
"harness|winogrande|5": {
"acc": 0.8666140489344909,
"acc_stderr": 0.00955544802642297
},
"harness|gsm8k|5": {
"acc": 0.6770280515542078,
"acc_stderr": 0.012880360794851805
}
}
```
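
The raw JSON for this run can also be fetched directly from the dataset repository; the following is a minimal sketch using `huggingface_hub` (the filename is taken from the run link above, and the exact key layout of the downloaded file may differ from the excerpt shown):

```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_udkai__Turdus",
    filename="results_2024-01-13T18-57-41.292260.json",
    repo_type="dataset",
)

with open(path) as f:
    raw = json.load(f)

print(list(raw.keys()))  # inspect the top-level structure before digging into scores
```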
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_udkai__Turdus | [
"region:us"
] | 2024-01-13T19:00:00+00:00 | {"pretty_name": "Evaluation run of udkai/Turdus", "dataset_summary": "Dataset automatically created during the evaluation run of model [udkai/Turdus](https://huggingface.co/udkai/Turdus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_udkai__Turdus\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T18:57:41.292260](https://huggingface.co/datasets/open-llm-leaderboard/details_udkai__Turdus/blob/main/results_2024-01-13T18-57-41.292260.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6516181926648874,\n \"acc_stderr\": 0.0321728872347043,\n \"acc_norm\": 0.650729026842337,\n \"acc_norm_stderr\": 0.03285551278950848,\n \"mc1\": 0.5471236230110159,\n \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.6711400532546105,\n \"mc2_stderr\": 0.015451181249566945\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7107508532423208,\n \"acc_stderr\": 0.013250012579393443,\n \"acc_norm\": 0.7337883959044369,\n \"acc_norm_stderr\": 0.012915774781523197\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7206731726747659,\n \"acc_stderr\": 0.004477514681328155,\n \"acc_norm\": 0.8855805616411073,\n \"acc_norm_stderr\": 0.0031766945645110784\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933712,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933712\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n 
},\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033484,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033484\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n 
\"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4491620111731844,\n \"acc_stderr\": 0.016635838341631928,\n \"acc_norm\": 0.4491620111731844,\n \"acc_norm_stderr\": 0.016635838341631928\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n \"acc_stderr\": 0.012738547371303957,\n \"acc_norm\": 0.46479791395045633,\n \"acc_norm_stderr\": 0.012738547371303957\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507205,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507205\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5471236230110159,\n \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.6711400532546105,\n \"mc2_stderr\": 0.015451181249566945\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8666140489344909,\n \"acc_stderr\": 0.00955544802642297\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6770280515542078,\n \"acc_stderr\": 0.012880360794851805\n }\n}\n```", "repo_url": "https://huggingface.co/udkai/Turdus", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": 
"[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|arc:challenge|25_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|gsm8k|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hellaswag|10_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-57-41.292260.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-57-41.292260.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-57-41.292260.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T18-57-41.292260.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-57-41.292260.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["**/details_harness|winogrande|5_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T18-57-41.292260.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T18_57_41.292260", "path": ["results_2024-01-13T18-57-41.292260.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T18-57-41.292260.parquet"]}]}]} | 2024-01-13T19:00:22+00:00 |
55da3294b5612884cc24ba75c5ad1ce5d42b6455 |
# Dataset Card for Evaluation run of Unbabel/TowerInstruct-7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Unbabel/TowerInstruct-7B-v0.1](https://huggingface.co/Unbabel/TowerInstruct-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Unbabel__TowerInstruct-7B-v0.1",
"harness_winogrande_5",
split="train")
```
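The aggregated metrics shown in the next section live in the "results" configuration mentioned above; as an illustrative sketch (the `latest` split name follows the convention used by the other leaderboard detail repositories, and the column layout is not documented here), they can be loaded the same way:

```python
from datasets import load_dataset

# Aggregated metrics rather than a single task's per-sample details.
agg = load_dataset("open-llm-leaderboard/details_Unbabel__TowerInstruct-7B-v0.1",
                   "results",
                   split="latest")

print(agg.column_names)
print(agg[0])
```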
## Latest results
These are the [latest results from run 2024-01-13T18:58:50.073000](https://huggingface.co/datasets/open-llm-leaderboard/details_Unbabel__TowerInstruct-7B-v0.1/blob/main/results_2024-01-13T18-58-50.073000.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4711217152311766,
"acc_stderr": 0.03442367854889606,
"acc_norm": 0.4757369265971281,
"acc_norm_stderr": 0.03519302105112233,
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.42594704830683766,
"mc2_stderr": 0.014921954316600566
},
"harness|arc:challenge|25": {
"acc": 0.5110921501706485,
"acc_stderr": 0.014607794914013048,
"acc_norm": 0.5546075085324232,
"acc_norm_stderr": 0.014523987638344076
},
"harness|hellaswag|10": {
"acc": 0.5993825931089425,
"acc_stderr": 0.004890221012015062,
"acc_norm": 0.789982075283808,
"acc_norm_stderr": 0.004064885496003441
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4276315789473684,
"acc_stderr": 0.04026097083296558,
"acc_norm": 0.4276315789473684,
"acc_norm_stderr": 0.04026097083296558
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4716981132075472,
"acc_stderr": 0.0307235352490061,
"acc_norm": 0.4716981132075472,
"acc_norm_stderr": 0.0307235352490061
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4236111111111111,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.4236111111111111,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918407,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918407
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.040406101782088394,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.040406101782088394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5193548387096775,
"acc_stderr": 0.02842268740431211,
"acc_norm": 0.5193548387096775,
"acc_norm_stderr": 0.02842268740431211
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.032257994762334846,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.032257994762334846
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6121212121212121,
"acc_stderr": 0.038049136539710114,
"acc_norm": 0.6121212121212121,
"acc_norm_stderr": 0.038049136539710114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.03502975799413007,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.03502975799413007
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6632124352331606,
"acc_stderr": 0.03410780251836184,
"acc_norm": 0.6632124352331606,
"acc_norm_stderr": 0.03410780251836184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.441025641025641,
"acc_stderr": 0.02517404838400076,
"acc_norm": 0.441025641025641,
"acc_norm_stderr": 0.02517404838400076
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02784081149587193,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02784081149587193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40336134453781514,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.40336134453781514,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6770642201834862,
"acc_stderr": 0.020048115923415325,
"acc_norm": 0.6770642201834862,
"acc_norm_stderr": 0.020048115923415325
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.030851992993257013,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.030851992993257013
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.03465868196380763,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.03465868196380763
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6582278481012658,
"acc_stderr": 0.030874537537553617,
"acc_norm": 0.6582278481012658,
"acc_norm_stderr": 0.030874537537553617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5426008968609866,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.5426008968609866,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5337423312883436,
"acc_stderr": 0.039194155450484096,
"acc_norm": 0.5337423312883436,
"acc_norm_stderr": 0.039194155450484096
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5436893203883495,
"acc_stderr": 0.049318019942204146,
"acc_norm": 0.5436893203883495,
"acc_norm_stderr": 0.049318019942204146
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.688034188034188,
"acc_stderr": 0.030351527323344927,
"acc_norm": 0.688034188034188,
"acc_norm_stderr": 0.030351527323344927
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.648786717752235,
"acc_stderr": 0.017069982051499434,
"acc_norm": 0.648786717752235,
"acc_norm_stderr": 0.017069982051499434
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.026864624366756646,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.026864624366756646
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.0285803410651383,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.0285803410651383
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5852090032154341,
"acc_stderr": 0.027982680459759563,
"acc_norm": 0.5852090032154341,
"acc_norm_stderr": 0.027982680459759563
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.027777777777777797,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.027777777777777797
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.029189805673587095,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.029189805673587095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35853976531942633,
"acc_stderr": 0.012248487319682741,
"acc_norm": 0.35853976531942633,
"acc_norm_stderr": 0.012248487319682741
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5110294117647058,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.5110294117647058,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.020036393768352635,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.020036393768352635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4326530612244898,
"acc_stderr": 0.03171752824062663,
"acc_norm": 0.4326530612244898,
"acc_norm_stderr": 0.03171752824062663
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.033687874661154596,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.033687874661154596
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.03777798822748018,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.03777798822748018
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.42594704830683766,
"mc2_stderr": 0.014921954316600566
},
"harness|winogrande|5": {
"acc": 0.739542225730071,
"acc_stderr": 0.012334833671998295
},
"harness|gsm8k|5": {
"acc": 0.1645185746777862,
"acc_stderr": 0.010212173002763541
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Unbabel__TowerInstruct-7B-v0.1 | [
"region:us"
] | 2024-01-13T19:01:12+00:00 | {"pretty_name": "Evaluation run of Unbabel/TowerInstruct-7B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [Unbabel/TowerInstruct-7B-v0.1](https://huggingface.co/Unbabel/TowerInstruct-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Unbabel__TowerInstruct-7B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T18:58:50.073000](https://huggingface.co/datasets/open-llm-leaderboard/details_Unbabel__TowerInstruct-7B-v0.1/blob/main/results_2024-01-13T18-58-50.073000.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4711217152311766,\n \"acc_stderr\": 0.03442367854889606,\n \"acc_norm\": 0.4757369265971281,\n \"acc_norm_stderr\": 0.03519302105112233,\n \"mc1\": 0.29253365973072215,\n \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.42594704830683766,\n \"mc2_stderr\": 0.014921954316600566\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5110921501706485,\n \"acc_stderr\": 0.014607794914013048,\n \"acc_norm\": 0.5546075085324232,\n \"acc_norm_stderr\": 0.014523987638344076\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5993825931089425,\n \"acc_stderr\": 0.004890221012015062,\n \"acc_norm\": 0.789982075283808,\n \"acc_norm_stderr\": 0.004064885496003441\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4276315789473684,\n \"acc_stderr\": 0.04026097083296558,\n \"acc_norm\": 0.4276315789473684,\n \"acc_norm_stderr\": 0.04026097083296558\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4716981132075472,\n \"acc_stderr\": 0.0307235352490061,\n \"acc_norm\": 0.4716981132075472,\n \"acc_norm_stderr\": 0.0307235352490061\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.4236111111111111,\n \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 
0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.3988439306358382,\n \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918407,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918407\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.040406101782088394,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.040406101782088394\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5193548387096775,\n \"acc_stderr\": 0.02842268740431211,\n \"acc_norm\": 0.5193548387096775,\n \"acc_norm_stderr\": 0.02842268740431211\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.032257994762334846,\n \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.032257994762334846\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.038049136539710114,\n \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.038049136539710114\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.03502975799413007,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.03502975799413007\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6632124352331606,\n \"acc_stderr\": 0.03410780251836184,\n \"acc_norm\": 0.6632124352331606,\n \"acc_norm_stderr\": 0.03410780251836184\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.441025641025641,\n \"acc_stderr\": 0.02517404838400076,\n \"acc_norm\": 0.441025641025641,\n \"acc_norm_stderr\": 0.02517404838400076\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587193,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587193\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.40336134453781514,\n \"acc_stderr\": 0.031866081214088314,\n \"acc_norm\": 0.40336134453781514,\n \"acc_norm_stderr\": 0.031866081214088314\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6770642201834862,\n \"acc_stderr\": 0.020048115923415325,\n \"acc_norm\": 0.6770642201834862,\n \"acc_norm_stderr\": 0.020048115923415325\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257013,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257013\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.03465868196380763,\n \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.03465868196380763\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6582278481012658,\n \"acc_stderr\": 0.030874537537553617,\n \"acc_norm\": 0.6582278481012658,\n \"acc_norm_stderr\": 0.030874537537553617\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5426008968609866,\n \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.5426008968609866,\n \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212094,\n \"acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212094\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5337423312883436,\n \"acc_stderr\": 0.039194155450484096,\n \"acc_norm\": 0.5337423312883436,\n \"acc_norm_stderr\": 0.039194155450484096\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5436893203883495,\n \"acc_stderr\": 0.049318019942204146,\n \"acc_norm\": 0.5436893203883495,\n \"acc_norm_stderr\": 0.049318019942204146\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.688034188034188,\n \"acc_stderr\": 0.030351527323344927,\n \"acc_norm\": 0.688034188034188,\n \"acc_norm_stderr\": 0.030351527323344927\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.648786717752235,\n \"acc_stderr\": 0.017069982051499434,\n \"acc_norm\": 0.648786717752235,\n \"acc_norm_stderr\": 0.017069982051499434\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5317919075144508,\n \"acc_stderr\": 0.026864624366756646,\n \"acc_norm\": 0.5317919075144508,\n \"acc_norm_stderr\": 0.026864624366756646\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.0285803410651383,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.0285803410651383\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5852090032154341,\n \"acc_stderr\": 0.027982680459759563,\n \"acc_norm\": 0.5852090032154341,\n \"acc_norm_stderr\": 0.027982680459759563\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.027777777777777797,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.027777777777777797\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3971631205673759,\n \"acc_stderr\": 0.029189805673587095,\n \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.029189805673587095\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35853976531942633,\n \"acc_stderr\": 0.012248487319682741,\n \"acc_norm\": 0.35853976531942633,\n \"acc_norm_stderr\": 0.012248487319682741\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5110294117647058,\n \"acc_stderr\": 0.030365446477275675,\n \"acc_norm\": 0.5110294117647058,\n \"acc_norm_stderr\": 0.030365446477275675\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.020036393768352635,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.020036393768352635\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4326530612244898,\n \"acc_stderr\": 0.03171752824062663,\n \"acc_norm\": 0.4326530612244898,\n \"acc_norm_stderr\": 0.03171752824062663\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n \"acc_stderr\": 0.033687874661154596,\n \"acc_norm\": 0.6517412935323383,\n \"acc_norm_stderr\": 0.033687874661154596\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n \"acc_stderr\": 0.03777798822748018,\n \"acc_norm\": 0.3795180722891566,\n \"acc_norm_stderr\": 0.03777798822748018\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29253365973072215,\n \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.42594704830683766,\n \"mc2_stderr\": 0.014921954316600566\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.012334833671998295\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1645185746777862,\n \"acc_stderr\": 0.010212173002763541\n }\n}\n```", 
"repo_url": "https://huggingface.co/Unbabel/TowerInstruct-7B-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|arc:challenge|25_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|gsm8k|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hellaswag|10_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-58-50.073000.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-58-50.073000.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-58-50.073000.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T18-58-50.073000.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-58-50.073000.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T18_58_50.073000", "path": ["**/details_harness|winogrande|5_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T18-58-50.073000.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T18_58_50.073000", "path": ["results_2024-01-13T18-58-50.073000.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T18-58-50.073000.parquet"]}]}]} | 2024-01-13T19:01:33+00:00 |
52c44e7cfffccf7046f3d15ae5ae02c6e250fc80 | zaydzuhri/the_pile_tokenized_5percent_truncated | [
"region:us"
] | 2024-01-13T19:02:02+00:00 | {"dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}], "splits": [{"name": "train", "num_bytes": 26575097543, "num_examples": 6000000}], "download_size": 8682781157, "dataset_size": 26575097543}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-13T19:36:11+00:00 |
|
6a00d474a0d63e05d23a6c78e5dfc265ee200512 | senhorsapo/yor | [
"license:openrail",
"region:us"
] | 2024-01-13T19:02:15+00:00 | {"license": "openrail"} | 2024-01-13T19:03:22+00:00 |
|
13a0791d4b4a9dc4afe3f1f30d88897cd24b72a3 | VietTung04/MovieReviewsData | [
"task_categories:text-classification",
"language:en",
"license:wtfpl",
"region:us"
] | 2024-01-13T19:02:17+00:00 | {"language": ["en"], "license": "wtfpl", "task_categories": ["text-classification"]} | 2024-01-13T19:16:54+00:00 |
|
ac0a48532abc3075e5375cc5881c8dd320bc30c5 | senhorsapo/loide | [
"license:openrail",
"region:us"
] | 2024-01-13T19:02:30+00:00 | {"license": "openrail"} | 2024-01-13T19:03:02+00:00 |
|
ac7cfa81287914454c536d5bbd6a7883344d68f7 | senhorsapo/jjwezzy | [
"license:openrail",
"region:us"
] | 2024-01-13T19:09:05+00:00 | {"license": "openrail"} | 2024-01-13T19:10:47+00:00 |
|
78ec9607b8d7ef271f33426ee7f962f74c1e500f | version-control/arrayblow-2.7 | [
"region:us"
] | 2024-01-13T19:09:07+00:00 | {"dataset_info": {"features": [{"name": "repo_name", "dtype": "string"}, {"name": "hexsha", "dtype": "string"}, {"name": "code", "dtype": "string"}, {"name": "file_path", "dtype": "string"}, {"name": "api_extract", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4815003, "num_examples": 305}, {"name": "test", "num_bytes": 1379473, "num_examples": 151}], "download_size": 1972734, "dataset_size": 6194476}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-13T20:35:51+00:00 |
|
37a6d0afc220232518aeaef81a6524d91ffc4728 | maxmyn/wholesome_simple_greentext_133k | [
"region:us"
] | 2024-01-13T19:11:23+00:00 | {"dataset_info": {"features": [{"name": "greentexts", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 17090474, "num_examples": 133442}], "download_size": 10465468, "dataset_size": 17090474}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T14:37:05+00:00 |
|
5c1485e1a744e61683df71aabc0e4616c511bbb7 | RenatoCaos/novomoreno | [
"license:apache-2.0",
"region:us"
] | 2024-01-13T19:17:21+00:00 | {"license": "apache-2.0"} | 2024-01-13T19:17:21+00:00 |
|
234d5e3bb2c0f12d9190596e3d34efbc69b81800 |
# Dataset of georgia/ジョージア/佐治亚 (Azur Lane)
This is the dataset of georgia/ジョージア/佐治亚 (Azur Lane), containing 35 images and their tags.
The core tags of this character are `breasts, blue_eyes, earrings, black_hair, large_breasts, bangs, heterochromia, hair_ornament, yellow_eyes, long_hair, hair_between_eyes, star_earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 35 | 48.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georgia_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 35 | 26.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georgia_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 76 | 49.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georgia_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 35 | 42.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georgia_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 76 | 73.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georgia_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/georgia_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
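The pre-packed `IMG+TXT` archives listed in the package table above can also be used without waifuc. Below is a minimal sketch (not taken from this card) for fetching the 800-pixel package and pairing each image with its tag file; it assumes the usual IMG+TXT convention of one same-named `.txt` sidecar per image, which this card does not explicitly guarantee.
```python
import os
import zipfile
from glob import glob

from huggingface_hub import hf_hub_download

# download and extract the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/georgia_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair every image with its assumed same-named .txt tag file
image_paths = sorted(
    glob(os.path.join(dataset_dir, '**', '*.png'), recursive=True)
    + glob(os.path.join(dataset_dir, '**', '*.jpg'), recursive=True)
)
for image_path in image_paths:
    tag_path = os.path.splitext(image_path)[0] + '.txt'
    if os.path.exists(tag_path):
        with open(tag_path, 'r', encoding='utf-8') as f:
            print(image_path, f.read().strip())
```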
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, cleavage, jewelry, simple_background, solo, looking_at_viewer, smile, white_background, brown_hair, choker, gloves, star_(symbol), upper_body |
| 1 | 6 |  |  |  |  |  | 1girl, ahoge, arm_strap, black_bikini, eyewear_on_head, jewelry, looking_at_viewer, solo, star_(symbol), sunglasses, short_hair_with_long_locks, bare_shoulders, navel, side-tie_bikini_bottom, simple_background, smile, choker, leg_ribbon, open_mouth, tankini, thigh_strap, white_background |
| 2 | 9 |  |  |  |  |  | 1girl, solo, cleavage, gloves, jewelry, thigh_strap, rigging, skirt, looking_at_viewer, short_hair, strapless, thighs, turret, asymmetrical_legwear, boots, cannon, full_body, single_thighhigh |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | jewelry | simple_background | solo | looking_at_viewer | smile | white_background | brown_hair | choker | gloves | star_(symbol) | upper_body | ahoge | arm_strap | black_bikini | eyewear_on_head | sunglasses | short_hair_with_long_locks | bare_shoulders | navel | side-tie_bikini_bottom | leg_ribbon | open_mouth | tankini | thigh_strap | rigging | skirt | short_hair | strapless | thighs | turret | asymmetrical_legwear | boots | cannon | full_body | single_thighhigh |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:----------|:--------------------|:-------|:--------------------|:--------|:-------------------|:-------------|:---------|:---------|:----------------|:-------------|:--------|:------------|:---------------|:------------------|:-------------|:-----------------------------|:-----------------|:--------|:-------------------------|:-------------|:-------------|:----------|:--------------|:----------|:--------|:-------------|:------------|:---------|:---------|:-----------------------|:--------|:---------|:------------|:-------------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | X | X | X | X | X | X | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | | X | X | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/georgia_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T19:21:12+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T19:32:12+00:00 |
084170f0f049a0b56181ab510950d58897f7be5b |
# Dataset of jade/ヤーデ/亚德 (Azur Lane)
This is the dataset of jade/ヤーデ/亚德 (Azur Lane), containing 46 images and their tags.
The core tags of this character are `breasts, blue_eyes, bangs, grey_hair, hair_bun, hair_ornament, large_breasts, short_hair, hair_between_eyes, hairclip, hat, double_bun, mole`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 46 | 92.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jade_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 46 | 43.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jade_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 117 | 93.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jade_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 46 | 78.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jade_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 117 | 153.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jade_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/jade_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
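If you only need aggregate tag statistics rather than the images themselves, the per-item metadata loaded above can be counted directly. This is a rough sketch: the exact structure of `item.meta['tags']` is not documented on this card, so the code accepts either a plain list of tags or a tag-to-score mapping and should be checked against one real item first.
```python
from collections import Counter

from waifuc.source import LocalSource

counter = Counter()
for item in LocalSource('dataset_dir'):
    tags = item.meta.get('tags', {})
    # accept either a {tag: score} mapping or a plain list of tag names
    counter.update(tags.keys() if isinstance(tags, dict) else tags)

# show the 20 most frequent tags across this character's images
for tag, count in counter.most_common(20):
    print(f'{count:4d}  {tag}')
```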
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, looking_at_viewer, solo, popsicle, sailor_collar, bracelet, white_one-piece_swimsuit, blush, bare_shoulders, innertube, water, covered_navel, holding, looking_back, smile |
| 1 | 25 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cleavage, smile, blush, long_sleeves, white_background, simple_background, white_gloves, black_headwear, thigh_strap, black_dress, skirt, cross, mole_under_eye |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | popsicle | sailor_collar | bracelet | white_one-piece_swimsuit | blush | bare_shoulders | innertube | water | covered_navel | holding | looking_back | smile | cleavage | long_sleeves | white_background | simple_background | white_gloves | black_headwear | thigh_strap | black_dress | skirt | cross | mole_under_eye |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------|:----------------|:-----------|:---------------------------|:--------|:-----------------|:------------|:--------|:----------------|:----------|:---------------|:--------|:-----------|:---------------|:-------------------|:--------------------|:---------------|:-----------------|:--------------|:--------------|:--------|:--------|:-----------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 25 |  |  |  |  |  | X | X | X | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/jade_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T19:21:13+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T19:33:01+00:00 |
4c3ef8819ca8e9b45fdd2bc427753678e3302820 |
# Dataset of odin/オーディン/奥丁 (Azur Lane)
This is the dataset of odin/オーディン/奥丁 (Azur Lane), containing 42 images and their tags.
The core tags of this character are `multicolored_hair, white_hair, red_hair, blue_eyes, long_hair, streaked_hair, hair_over_one_eye, hat, peaked_cap, military_hat, black_headwear, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 42 | 70.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/odin_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 42 | 35.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/odin_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 92 | 74.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/odin_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 42 | 61.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/odin_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 92 | 111.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/odin_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/odin_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------|
| 0 | 42 |  |  |  |  |  | 1girl, solo, looking_at_viewer, black_coat, iron_cross, breastplate, sword, open_coat, holding, sheath |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | black_coat | iron_cross | breastplate | sword | open_coat | holding | sheath |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-------------|:-------------|:--------------|:--------|:------------|:----------|:---------|
| 0 | 42 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/odin_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T19:21:13+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T19:30:27+00:00 |
611a973fc613cd15ba77ca0bed72afa8a3e6fc3a |
# Dataset of aoba/青葉/青叶 (Azur Lane)
This is the dataset of aoba/青葉/青叶 (Azur Lane), containing 10 images and their tags.
The core tags of this character are `long_hair, bangs, aqua_hair, breasts, brown_eyes, animal_ears, twintails, medium_breasts, blue_hair, earrings, hair_between_eyes, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 8.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aoba_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 5.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aoba_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 19 | 9.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aoba_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 7.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aoba_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 19 | 13.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aoba_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/aoba_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, smile, solo, cleavage, looking_at_viewer, open_jacket, pleated_skirt, black_shirt, simple_background, white_skirt, black_jacket, blush, collarbone, white_background, holding_pen, jewelry, miniskirt, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | solo | cleavage | looking_at_viewer | open_jacket | pleated_skirt | black_shirt | simple_background | white_skirt | black_jacket | blush | collarbone | white_background | holding_pen | jewelry | miniskirt | open_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:-----------|:--------------------|:--------------|:----------------|:--------------|:--------------------|:--------------|:---------------|:--------|:-------------|:-------------------|:--------------|:----------|:------------|:-------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/aoba_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T19:21:16+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T19:24:55+00:00 |
b5b97f1c58eefb1baa7335fe117a2a3c7d17282c |
# Dataset of comet/コメット/彗星 (Azur Lane)
This is the dataset of comet/コメット/彗星 (Azur Lane), containing 34 images and their tags.
The core tags of this character are `green_hair, long_hair, red_eyes, twintails, ahoge, hat, bangs, beret, breasts, hair_between_eyes, hair_ornament, white_headwear, ribbon, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 34 | 41.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/comet_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 34 | 25.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/comet_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 77 | 50.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/comet_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 34 | 36.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/comet_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 77 | 67.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/comet_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/comet_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | smile, 1girl, solo, open_mouth, star_(symbol), blush, choker, looking_at_viewer, puffy_sleeves, white_thighhighs, blue_skirt, plaid_skirt, white_shirt, long_sleeves, collared_shirt, one_eye_closed, ;d, hair_ribbon, pleated_skirt, retrofit_(azur_lane), white_background |
| 1 | 5 |  |  |  |  |  | 2girls, smile, 1girl, blush, looking_at_viewer, open_mouth, solo_focus, blonde_hair, thighhighs, collarbone, one_eye_closed, skirt, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | smile | 1girl | solo | open_mouth | star_(symbol) | blush | choker | looking_at_viewer | puffy_sleeves | white_thighhighs | blue_skirt | plaid_skirt | white_shirt | long_sleeves | collared_shirt | one_eye_closed | ;d | hair_ribbon | pleated_skirt | retrofit_(azur_lane) | white_background | 2girls | solo_focus | blonde_hair | thighhighs | collarbone | skirt | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:-------------|:----------------|:--------|:---------|:--------------------|:----------------|:-------------------|:-------------|:--------------|:--------------|:---------------|:-----------------|:-----------------|:-----|:--------------|:----------------|:-----------------------|:-------------------|:---------|:-------------|:--------------|:-------------|:-------------|:--------|:-----------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | | X | | X | | X | | | | | | | | X | | | | | | X | X | X | X | X | X | X |
| CyberHarem/comet_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T19:21:22+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T19:31:04+00:00 |
8a11bf5b81e75a4ae201136e7a2d9112f46dc2b4 |
# Dataset of joffre/ジョッフル/霞飞 (Azur Lane)
This is the dataset of joffre/ジョッフル/霞飞 (Azur Lane), containing 73 images and their tags.
The core tags of this character are `breasts, twintails, large_breasts, hair_ornament, bangs, red_eyes, grey_hair, long_hair, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 73 | 135.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/joffre_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 73 | 65.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/joffre_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 190 | 151.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/joffre_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 73 | 114.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/joffre_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 190 | 230.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/joffre_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/joffre_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, cleavage, holding_sword, looking_at_viewer, solo, white_dress, black_gloves, black_choker, fingerless_gloves, white_thighhighs, wide_sleeves, feathered_wings, juliet_sleeves, simple_background, black_wings, medium_breasts, white_background |
| 1 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_dress, black_gloves, cleavage, white_thighhighs, fingerless_gloves, parted_lips, black_choker, juliet_sleeves, thighs, medium_breasts, wide_sleeves |
| 2 | 5 |  |  |  |  |  | 1girl, blue_dress, cleavage, crown, hair_bow, looking_at_viewer, sitting, solo, white_thighhighs, pink_eyes, bare_shoulders, black_bow, blue_footwear, butterfly, garter_straps, high_heels, red_cape, wrist_cuffs, ass, detached_sleeves, frills, hair_between_eyes, official_alternate_costume, parted_lips, puffy_sleeves, sidelocks, simple_background, smile, white_background |
| 3 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, open_mouth, black_gloves, penis, solo_focus, bar_censor, fingerless_gloves, sex, vaginal, black_choker, breasts_out, cross-section, cum_in_pussy, cum_on_breasts, grabbing, internal_cumshot, purple_eyes, sweat, uterus |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | holding_sword | looking_at_viewer | solo | white_dress | black_gloves | black_choker | fingerless_gloves | white_thighhighs | wide_sleeves | feathered_wings | juliet_sleeves | simple_background | black_wings | medium_breasts | white_background | parted_lips | thighs | blue_dress | crown | hair_bow | sitting | pink_eyes | bare_shoulders | black_bow | blue_footwear | butterfly | garter_straps | high_heels | red_cape | wrist_cuffs | ass | detached_sleeves | frills | hair_between_eyes | official_alternate_costume | puffy_sleeves | sidelocks | smile | 1boy | blush | hetero | nipples | open_mouth | penis | solo_focus | bar_censor | sex | vaginal | breasts_out | cross-section | cum_in_pussy | cum_on_breasts | grabbing | internal_cumshot | purple_eyes | sweat | uterus |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:----------------|:--------------------|:-------|:--------------|:---------------|:---------------|:--------------------|:-------------------|:---------------|:------------------|:-----------------|:--------------------|:--------------|:-----------------|:-------------------|:--------------|:---------|:-------------|:--------|:-----------|:----------|:------------|:-----------------|:------------|:----------------|:------------|:----------------|:-------------|:-----------|:--------------|:------|:-------------------|:---------|:--------------------|:-----------------------------|:----------------|:------------|:--------|:-------|:--------|:---------|:----------|:-------------|:--------|:-------------|:-------------|:------|:----------|:--------------|:----------------|:---------------|:-----------------|:-----------|:-------------------|:--------------|:--------|:---------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | X | X | X | X | X | X | X | X | | X | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | X | X | | | | | X | | | | X | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/joffre_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T19:21:29+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T19:43:06+00:00 |
9b1572cfeb6fe002d66d9509996143c2f51fb6c9 | EbubeJohnEnyi/Q_and_A | [
"region:us"
] | 2024-01-13T19:23:44+00:00 | {} | 2024-01-13T19:24:32+00:00 |
|
b80d4a52237e6edd9b59045bc5b7aeeb4bebf175 |
# Dataset Card for Evaluation run of KnutJaegersberg/Qwen-14B-Llamafied
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Qwen-14B-Llamafied](https://huggingface.co/KnutJaegersberg/Qwen-14B-Llamafied) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Qwen-14B-Llamafied",
"harness_winogrande_5",
split="train")
```
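Besides the per-task details, the aggregated metrics (shown under "Latest results" below) can be loaded through the "results" configuration described above. A minimal sketch, assuming the same `load_dataset` interface and the "latest" split naming; the exact column layout of the aggregated rows is not documented here, so inspect the first row before relying on specific fields:
```python
from datasets import load_dataset

# "latest" always points to the newest aggregated run of the "results" config
results = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Qwen-14B-Llamafied",
                       "results",
                       split="latest")

# inspect the available columns of the aggregated results row(s)
print(results[0].keys())
```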
## Latest results
These are the [latest results from run 2024-01-13T19:31:00.889052](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Qwen-14B-Llamafied/blob/main/results_2024-01-13T19-31-00.889052.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6576627342855309,
"acc_stderr": 0.032068617008274285,
"acc_norm": 0.661971967843754,
"acc_norm_stderr": 0.03270262072736983,
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144912,
"mc2": 0.4560156985268844,
"mc2_stderr": 0.014814301713999594
},
"harness|arc:challenge|25": {
"acc": 0.507679180887372,
"acc_stderr": 0.01460966744089257,
"acc_norm": 0.5520477815699659,
"acc_norm_stderr": 0.014532011498211676
},
"harness|hellaswag|10": {
"acc": 0.6353316072495518,
"acc_stderr": 0.004803533333364225,
"acc_norm": 0.8231428002389962,
"acc_norm_stderr": 0.003807680331172903
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106134,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6382978723404256,
"acc_stderr": 0.03141082197596241,
"acc_norm": 0.6382978723404256,
"acc_norm_stderr": 0.03141082197596241
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6758620689655173,
"acc_stderr": 0.03900432069185554,
"acc_norm": 0.6758620689655173,
"acc_norm_stderr": 0.03900432069185554
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5502645502645502,
"acc_stderr": 0.02562085704293665,
"acc_norm": 0.5502645502645502,
"acc_norm_stderr": 0.02562085704293665
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.02188617856717253,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.02188617856717253
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6354679802955665,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.6354679802955665,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026552207828215286,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026552207828215286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678192,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678192
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6897435897435897,
"acc_stderr": 0.02345467488940429,
"acc_norm": 0.6897435897435897,
"acc_norm_stderr": 0.02345467488940429
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.02956070739246571,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.02956070739246571
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.029213549414372174,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.029213549414372174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4105960264900662,
"acc_stderr": 0.04016689594849927,
"acc_norm": 0.4105960264900662,
"acc_norm_stderr": 0.04016689594849927
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530368,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530368
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.03402272044340703,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.03402272044340703
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8396624472573839,
"acc_stderr": 0.02388438092596567,
"acc_norm": 0.8396624472573839,
"acc_norm_stderr": 0.02388438092596567
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7354260089686099,
"acc_stderr": 0.029605103217038332,
"acc_norm": 0.7354260089686099,
"acc_norm_stderr": 0.029605103217038332
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5892857142857143,
"acc_stderr": 0.04669510663875192,
"acc_norm": 0.5892857142857143,
"acc_norm_stderr": 0.04669510663875192
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608303,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608303
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4770949720670391,
"acc_stderr": 0.016704945740326188,
"acc_norm": 0.4770949720670391,
"acc_norm_stderr": 0.016704945740326188
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.024954184324879915,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.024954184324879915
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7363344051446945,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.7363344051446945,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4810951760104302,
"acc_stderr": 0.012761104871472662,
"acc_norm": 0.4810951760104302,
"acc_norm_stderr": 0.012761104871472662
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.018492596536396955,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.018492596536396955
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399683,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399683
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144912,
"mc2": 0.4560156985268844,
"mc2_stderr": 0.014814301713999594
},
"harness|winogrande|5": {
"acc": 0.7655880031570639,
"acc_stderr": 0.011906130106237986
},
"harness|gsm8k|5": {
"acc": 0.5276724791508719,
"acc_stderr": 0.013751375538801326
}
}
```
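If you prefer to work with the raw JSON rather than the parquet configurations, the per-run results file can be fetched directly from the repository. This is a small sketch using the filename referenced in this run's link above; the file's full top-level layout is not reproduced on this card, so the sketch only inspects the keys before drilling into per-task metrics.
```python
import json

from huggingface_hub import hf_hub_download

# fetch the raw results file for the 2024-01-13T19:31:00.889052 run referenced above
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_KnutJaegersberg__Qwen-14B-Llamafied",
    repo_type="dataset",
    filename="results_2024-01-13T19-31-00.889052.json",
)
with open(path, "r", encoding="utf-8") as f:
    run = json.load(f)

# look at the top-level structure first
print(sorted(run.keys()))
```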
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_KnutJaegersberg__Qwen-14B-Llamafied | [
"region:us"
] | 2024-01-13T19:33:11+00:00 | {"pretty_name": "Evaluation run of KnutJaegersberg/Qwen-14B-Llamafied", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/Qwen-14B-Llamafied](https://huggingface.co/KnutJaegersberg/Qwen-14B-Llamafied) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Qwen-14B-Llamafied\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T19:31:00.889052](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Qwen-14B-Llamafied/blob/main/results_2024-01-13T19-31-00.889052.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6576627342855309,\n \"acc_stderr\": 0.032068617008274285,\n \"acc_norm\": 0.661971967843754,\n \"acc_norm_stderr\": 0.03270262072736983,\n \"mc1\": 0.30966952264381886,\n \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.4560156985268844,\n \"mc2_stderr\": 0.014814301713999594\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.507679180887372,\n \"acc_stderr\": 0.01460966744089257,\n \"acc_norm\": 0.5520477815699659,\n \"acc_norm_stderr\": 0.014532011498211676\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6353316072495518,\n \"acc_stderr\": 0.004803533333364225,\n \"acc_norm\": 0.8231428002389962,\n \"acc_norm_stderr\": 0.003807680331172903\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106134,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6382978723404256,\n \"acc_stderr\": 0.03141082197596241,\n \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.03141082197596241\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6758620689655173,\n \"acc_stderr\": 0.03900432069185554,\n \"acc_norm\": 0.6758620689655173,\n \"acc_norm_stderr\": 0.03900432069185554\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5502645502645502,\n \"acc_stderr\": 0.02562085704293665,\n \"acc_norm\": 0.5502645502645502,\n \"acc_norm_stderr\": 0.02562085704293665\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n \"acc_stderr\": 0.02188617856717253,\n \"acc_norm\": 0.8193548387096774,\n \"acc_norm_stderr\": 0.02188617856717253\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6354679802955665,\n \"acc_stderr\": 0.033864057460620905,\n \"acc_norm\": 0.6354679802955665,\n \"acc_norm_stderr\": 0.033864057460620905\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026552207828215286,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026552207828215286\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678192,\n \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678192\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6897435897435897,\n \"acc_stderr\": 0.02345467488940429,\n \"acc_norm\": 0.6897435897435897,\n \"acc_norm_stderr\": 0.02345467488940429\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.02956070739246571,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.02956070739246571\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.029213549414372174,\n \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.029213549414372174\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849927,\n \"acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849927\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530368,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530368\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5787037037037037,\n \"acc_stderr\": 0.03367462138896078,\n \"acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.03367462138896078\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6225490196078431,\n \"acc_stderr\": 0.03402272044340703,\n \"acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.03402272044340703\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8396624472573839,\n \"acc_stderr\": 0.02388438092596567,\n \"acc_norm\": 0.8396624472573839,\n \"acc_norm_stderr\": 0.02388438092596567\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7354260089686099,\n \"acc_stderr\": 0.029605103217038332,\n \"acc_norm\": 0.7354260089686099,\n \"acc_norm_stderr\": 0.029605103217038332\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5892857142857143,\n \"acc_stderr\": 0.04669510663875192,\n \"acc_norm\": 0.5892857142857143,\n \"acc_norm_stderr\": 0.04669510663875192\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8301404853128991,\n \"acc_stderr\": 0.013428186370608303,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608303\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4770949720670391,\n \"acc_stderr\": 0.016704945740326188,\n \"acc_norm\": 0.4770949720670391,\n \"acc_norm_stderr\": 0.016704945740326188\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.024954184324879915,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.024954184324879915\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7363344051446945,\n \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.7363344051446945,\n \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4810951760104302,\n \"acc_stderr\": 0.012761104871472662,\n \"acc_norm\": 0.4810951760104302,\n \"acc_norm_stderr\": 0.012761104871472662\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.018492596536396955,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.018492596536396955\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399683,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399683\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.4560156985268844,\n \"mc2_stderr\": 0.014814301713999594\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237986\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5276724791508719,\n \"acc_stderr\": 
0.013751375538801326\n }\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/Qwen-14B-Llamafied", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|arc:challenge|25_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|gsm8k|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hellaswag|10_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-31-00.889052.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-31-00.889052.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-31-00.889052.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T19-31-00.889052.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-31-00.889052.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T19_31_00.889052", "path": ["**/details_harness|winogrande|5_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T19-31-00.889052.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T19_31_00.889052", "path": ["results_2024-01-13T19-31-00.889052.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T19-31-00.889052.parquet"]}]}]} | 2024-01-13T19:33:31+00:00 |
a436cd5c0411bd63566fbe1150c965e80f174e1c | Novin-AI/INST-LStyle | [
"region:us"
] | 2024-01-13T19:38:03+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 49387145, "num_examples": 35117}], "download_size": 22545255, "dataset_size": 49387145}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-13T19:38:26+00:00 |
|
dd21cf0adfabfb17f1d2278ffbde8fe1a537c3d7 |
# Dataset Card for Evaluation run of KnutJaegersberg/internlm-20b-llamafied
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KnutJaegersberg/internlm-20b-llamafied](https://huggingface.co/KnutJaegersberg/internlm-20b-llamafied) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__internlm-20b-llamafied",
"harness_winogrande_5",
split="train")
```
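
The aggregated metrics live in the "results" configuration, which can be loaded the same way; a minimal sketch, assuming the `latest` split that configuration exposes:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run (split names follow the run timestamps)
results = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__internlm-20b-llamafied",
	"results",
	split="latest")
```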
## Latest results
These are the [latest results from run 2024-01-13T19:39:44.590825](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__internlm-20b-llamafied/blob/main/results_2024-01-13T19-39-44.590825.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2529697662773502,
"acc_stderr": 0.03077338700338214,
"acc_norm": 0.2544077864541873,
"acc_norm_stderr": 0.031593813415922045,
"mc1": 0.2215422276621787,
"mc1_stderr": 0.014537867601301145,
"mc2": 0.4805606031451568,
"mc2_stderr": 0.016999605402858272
},
"harness|arc:challenge|25": {
"acc": 0.21928327645051193,
"acc_stderr": 0.012091245787615723,
"acc_norm": 0.26791808873720135,
"acc_norm_stderr": 0.012942030195136423
},
"harness|hellaswag|10": {
"acc": 0.25542720573590916,
"acc_stderr": 0.004352098082984431,
"acc_norm": 0.26399123680541725,
"acc_norm_stderr": 0.004398937225038417
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.03547854198560826,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.03547854198560826
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.28289473684210525,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.28289473684210525,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.02761116340239972,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.02761116340239972
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.034765901043041336,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.034765901043041336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.03456425745087,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.03456425745087
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.17446808510638298,
"acc_stderr": 0.024809442335503973,
"acc_norm": 0.17446808510638298,
"acc_norm_stderr": 0.024809442335503973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.0409698513984367,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.0409698513984367
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24193548387096775,
"acc_stderr": 0.024362599693031103,
"acc_norm": 0.24193548387096775,
"acc_norm_stderr": 0.024362599693031103
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0317852971064275,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0317852971064275
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3393939393939394,
"acc_stderr": 0.03697442205031595,
"acc_norm": 0.3393939393939394,
"acc_norm_stderr": 0.03697442205031595
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.30808080808080807,
"acc_stderr": 0.03289477330098616,
"acc_norm": 0.30808080808080807,
"acc_norm_stderr": 0.03289477330098616
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23316062176165803,
"acc_stderr": 0.03051611137147602,
"acc_norm": 0.23316062176165803,
"acc_norm_stderr": 0.03051611137147602
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.023119362758232273,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.023119362758232273
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073835,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073835
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2689075630252101,
"acc_stderr": 0.028801392193631276,
"acc_norm": 0.2689075630252101,
"acc_norm_stderr": 0.028801392193631276
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.036848815213890225,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.036848815213890225
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24403669724770644,
"acc_stderr": 0.01841528635141641,
"acc_norm": 0.24403669724770644,
"acc_norm_stderr": 0.01841528635141641
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.0316746870682898,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.0316746870682898
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693244,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693244
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.23628691983122363,
"acc_stderr": 0.027652153144159267,
"acc_norm": 0.23628691983122363,
"acc_norm_stderr": 0.027652153144159267
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.17040358744394618,
"acc_stderr": 0.025234593447136165,
"acc_norm": 0.17040358744394618,
"acc_norm_stderr": 0.025234593447136165
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2900763358778626,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.2900763358778626,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.21487603305785125,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.21487603305785125,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3312883435582822,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.3312883435582822,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.19642857142857142,
"acc_stderr": 0.03770970049347018,
"acc_norm": 0.19642857142857142,
"acc_norm_stderr": 0.03770970049347018
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.1581196581196581,
"acc_stderr": 0.023902325549560392,
"acc_norm": 0.1581196581196581,
"acc_norm_stderr": 0.023902325549560392
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26947637292464877,
"acc_stderr": 0.01586624307321505,
"acc_norm": 0.26947637292464877,
"acc_norm_stderr": 0.01586624307321505
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.18497109826589594,
"acc_stderr": 0.020903975842083027,
"acc_norm": 0.18497109826589594,
"acc_norm_stderr": 0.020903975842083027
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2245810055865922,
"acc_stderr": 0.01395680366654464,
"acc_norm": 0.2245810055865922,
"acc_norm_stderr": 0.01395680366654464
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.025261691219729474,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.025261691219729474
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2861736334405145,
"acc_stderr": 0.02567025924218895,
"acc_norm": 0.2861736334405145,
"acc_norm_stderr": 0.02567025924218895
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.02465968518596729,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.02465968518596729
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2801418439716312,
"acc_stderr": 0.02678917235114024,
"acc_norm": 0.2801418439716312,
"acc_norm_stderr": 0.02678917235114024
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2653194263363755,
"acc_stderr": 0.011276198843958878,
"acc_norm": 0.2653194263363755,
"acc_norm_stderr": 0.011276198843958878
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.29044117647058826,
"acc_stderr": 0.027576468622740522,
"acc_norm": 0.29044117647058826,
"acc_norm_stderr": 0.027576468622740522
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.017077373377857016,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.017077373377857016
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.036942843353378,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.036942843353378
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2816326530612245,
"acc_stderr": 0.028795185574291286,
"acc_norm": 0.2816326530612245,
"acc_norm_stderr": 0.028795185574291286
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772426,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772426
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2289156626506024,
"acc_stderr": 0.03270745277352477,
"acc_norm": 0.2289156626506024,
"acc_norm_stderr": 0.03270745277352477
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23976608187134502,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.23976608187134502,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2215422276621787,
"mc1_stderr": 0.014537867601301145,
"mc2": 0.4805606031451568,
"mc2_stderr": 0.016999605402858272
},
"harness|winogrande|5": {
"acc": 0.47829518547750594,
"acc_stderr": 0.01403923921648463
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
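
If you prefer working with the raw JSON linked above rather than the dataset configurations, it can be downloaded directly from the Hub; a minimal sketch, assuming the filename taken from the link above and the `huggingface_hub` client:

```python
import json
from huggingface_hub import hf_hub_download

# Fetch the results file referenced in the link above from the dataset repo
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_KnutJaegersberg__internlm-20b-llamafied",
    filename="results_2024-01-13T19-39-44.590825.json",
    repo_type="dataset",
)
with open(path) as f:
    latest = json.load(f)

print(list(latest))  # inspect the top-level keys; the metrics shown above live under these
```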
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_KnutJaegersberg__internlm-20b-llamafied | [
"region:us"
] | 2024-01-13T19:42:00+00:00 | {"pretty_name": "Evaluation run of KnutJaegersberg/internlm-20b-llamafied", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/internlm-20b-llamafied](https://huggingface.co/KnutJaegersberg/internlm-20b-llamafied) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__internlm-20b-llamafied\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T19:39:44.590825](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__internlm-20b-llamafied/blob/main/results_2024-01-13T19-39-44.590825.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2529697662773502,\n \"acc_stderr\": 0.03077338700338214,\n \"acc_norm\": 0.2544077864541873,\n \"acc_norm_stderr\": 0.031593813415922045,\n \"mc1\": 0.2215422276621787,\n \"mc1_stderr\": 0.014537867601301145,\n \"mc2\": 0.4805606031451568,\n \"mc2_stderr\": 0.016999605402858272\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.21928327645051193,\n \"acc_stderr\": 0.012091245787615723,\n \"acc_norm\": 0.26791808873720135,\n \"acc_norm_stderr\": 0.012942030195136423\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25542720573590916,\n \"acc_stderr\": 0.004352098082984431,\n \"acc_norm\": 0.26399123680541725,\n \"acc_norm_stderr\": 0.004398937225038417\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.21481481481481482,\n \"acc_stderr\": 0.03547854198560826,\n \"acc_norm\": 0.21481481481481482,\n \"acc_norm_stderr\": 0.03547854198560826\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.28289473684210525,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.28289473684210525,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.02761116340239972,\n \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.02761116340239972\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.034765901043041336,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.034765901043041336\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.28901734104046245,\n \"acc_stderr\": 0.03456425745087,\n \"acc_norm\": 0.28901734104046245,\n \"acc_norm_stderr\": 0.03456425745087\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.17446808510638298,\n \"acc_stderr\": 0.024809442335503973,\n \"acc_norm\": 0.17446808510638298,\n \"acc_norm_stderr\": 0.024809442335503973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.0409698513984367,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.0409698513984367\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24193548387096775,\n \"acc_stderr\": 0.024362599693031103,\n \"acc_norm\": 0.24193548387096775,\n \"acc_norm_stderr\": 0.024362599693031103\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0317852971064275,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0317852971064275\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.3393939393939394,\n \"acc_stderr\": 0.03697442205031595,\n \"acc_norm\": 0.3393939393939394,\n \"acc_norm_stderr\": 0.03697442205031595\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.30808080808080807,\n \"acc_stderr\": 0.03289477330098616,\n \"acc_norm\": 0.30808080808080807,\n \"acc_norm_stderr\": 0.03289477330098616\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.03051611137147602,\n \"acc_norm\": 0.23316062176165803,\n 
\"acc_norm_stderr\": 0.03051611137147602\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2948717948717949,\n \"acc_stderr\": 0.023119362758232273,\n \"acc_norm\": 0.2948717948717949,\n \"acc_norm_stderr\": 0.023119362758232273\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073835,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073835\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2689075630252101,\n \"acc_stderr\": 0.028801392193631276,\n \"acc_norm\": 0.2689075630252101,\n \"acc_norm_stderr\": 0.028801392193631276\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.036848815213890225,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.036848815213890225\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24403669724770644,\n \"acc_stderr\": 0.01841528635141641,\n \"acc_norm\": 0.24403669724770644,\n \"acc_norm_stderr\": 0.01841528635141641\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.0316746870682898,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.0316746870682898\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693244,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693244\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.23628691983122363,\n \"acc_stderr\": 0.027652153144159267,\n \"acc_norm\": 0.23628691983122363,\n \"acc_norm_stderr\": 0.027652153144159267\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.17040358744394618,\n \"acc_stderr\": 0.025234593447136165,\n \"acc_norm\": 0.17040358744394618,\n \"acc_norm_stderr\": 0.025234593447136165\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2900763358778626,\n \"acc_stderr\": 0.03980066246467766,\n \"acc_norm\": 0.2900763358778626,\n \"acc_norm_stderr\": 0.03980066246467766\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.21487603305785125,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.21487603305785125,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3312883435582822,\n \"acc_stderr\": 0.03697983910025588,\n \"acc_norm\": 0.3312883435582822,\n \"acc_norm_stderr\": 0.03697983910025588\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.19642857142857142,\n \"acc_stderr\": 0.03770970049347018,\n \"acc_norm\": 0.19642857142857142,\n \"acc_norm_stderr\": 0.03770970049347018\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.043546310772605956,\n \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.043546310772605956\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.1581196581196581,\n \"acc_stderr\": 0.023902325549560392,\n \"acc_norm\": 0.1581196581196581,\n \"acc_norm_stderr\": 0.023902325549560392\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 
0.03775251680686371\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26947637292464877,\n \"acc_stderr\": 0.01586624307321505,\n \"acc_norm\": 0.26947637292464877,\n \"acc_norm_stderr\": 0.01586624307321505\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.18497109826589594,\n \"acc_stderr\": 0.020903975842083027,\n \"acc_norm\": 0.18497109826589594,\n \"acc_norm_stderr\": 0.020903975842083027\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2245810055865922,\n \"acc_stderr\": 0.01395680366654464,\n \"acc_norm\": 0.2245810055865922,\n \"acc_norm_stderr\": 0.01395680366654464\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.025261691219729474,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.025261691219729474\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2861736334405145,\n \"acc_stderr\": 0.02567025924218895,\n \"acc_norm\": 0.2861736334405145,\n \"acc_norm_stderr\": 0.02567025924218895\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.02465968518596729,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.02465968518596729\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2801418439716312,\n \"acc_stderr\": 0.02678917235114024,\n \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.02678917235114024\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2653194263363755,\n \"acc_stderr\": 0.011276198843958878,\n \"acc_norm\": 0.2653194263363755,\n \"acc_norm_stderr\": 0.011276198843958878\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.29044117647058826,\n \"acc_stderr\": 0.027576468622740522,\n \"acc_norm\": 0.29044117647058826,\n \"acc_norm_stderr\": 0.027576468622740522\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.017077373377857016,\n \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.017077373377857016\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.18181818181818182,\n \"acc_stderr\": 0.036942843353378,\n \"acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.036942843353378\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2816326530612245,\n \"acc_stderr\": 0.028795185574291286,\n \"acc_norm\": 0.2816326530612245,\n \"acc_norm_stderr\": 0.028795185574291286\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n \"acc_stderr\": 0.029705284056772426,\n \"acc_norm\": 0.22885572139303484,\n \"acc_norm_stderr\": 0.029705284056772426\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2289156626506024,\n \"acc_stderr\": 0.03270745277352477,\n \"acc_norm\": 0.2289156626506024,\n \"acc_norm_stderr\": 0.03270745277352477\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.23976608187134502,\n \"acc_stderr\": 0.03274485211946956,\n \"acc_norm\": 0.23976608187134502,\n \"acc_norm_stderr\": 0.03274485211946956\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2215422276621787,\n \"mc1_stderr\": 0.014537867601301145,\n \"mc2\": 0.4805606031451568,\n \"mc2_stderr\": 0.016999605402858272\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.47829518547750594,\n \"acc_stderr\": 0.01403923921648463\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/internlm-20b-llamafied", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|arc:challenge|25_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|gsm8k|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hellaswag|10_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-39-44.590825.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-39-44.590825.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-39-44.590825.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T19-39-44.590825.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-39-44.590825.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T19_39_44.590825", "path": ["**/details_harness|winogrande|5_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T19-39-44.590825.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T19_39_44.590825", "path": ["results_2024-01-13T19-39-44.590825.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T19-39-44.590825.parquet"]}]}]} | 2024-01-13T19:42:21+00:00 |
a32266c24a2ec57996f9d9da4e209b3260e4c3b5 |
# Dataset of akatsuki/暁/晓 (Azur Lane)
This is the dataset of akatsuki/暁/晓 (Azur Lane), containing 17 images and their tags.
The core tags of this character are `black_hair, long_hair, ponytail, bangs, red_eyes, hair_between_eyes, breasts, eyepatch, high_ponytail, horns`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 13.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akatsuki_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 10.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akatsuki_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 40 | 19.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akatsuki_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 12.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akatsuki_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 40 | 22.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akatsuki_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/akatsuki_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
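The IMG+TXT packages in the table above ship each image together with a plain-text tag file. Below is a minimal sketch of downloading the `800` package and walking the image/tag pairs; the image extensions, the same-name `.txt` pairing convention, and the `dataset_800` directory name are assumptions for illustration, not guarantees about the package layout.
```python
import os
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download

# download one of the IMG+TXT packages listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/akatsuki_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract it to a local directory
pack_dir = 'dataset_800'
os.makedirs(pack_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(pack_dir)

# pair each image with the same-named .txt tag file
# (extensions and pairing convention are assumptions about the package layout)
image_suffixes = {'.png', '.jpg', '.jpeg', '.webp'}
for img_path in sorted(Path(pack_dir).rglob('*')):
    if img_path.suffix.lower() not in image_suffixes:
        continue
    txt_path = img_path.with_suffix('.txt')
    if txt_path.exists():
        tags = txt_path.read_text(encoding='utf-8').strip()
        print(img_path.name, '->', tags)
```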
## List of Clusters
List of tag clustering results; some recurring outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, looking_at_viewer, solo, mask, scarf, simple_background, white_background, elbow_gloves, fingerless_gloves, full_body, midriff, ninja, weapon |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | mask | scarf | simple_background | white_background | elbow_gloves | fingerless_gloves | full_body | midriff | ninja | weapon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-------|:--------|:--------------------|:-------------------|:---------------|:--------------------|:------------|:----------|:--------|:---------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/akatsuki_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T19:42:29+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T19:52:34+00:00 |
008fd30fa61d32baa2411a1f5817578f38d711a2 |
# Dataset of duca_degli_abruzzi/ドゥーカ・デッリ・アブルッツィ/阿布鲁齐公爵 (Azur Lane)
This is the dataset of duca_degli_abruzzi/ドゥーカ・デッリ・アブルッツィ/阿布鲁齐公爵 (Azur Lane), containing 83 images and their tags.
The core tags of this character are `breasts, red_eyes, large_breasts, bangs, halo, long_hair, pink_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 83 | 149.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/duca_degli_abruzzi_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 83 | 67.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/duca_degli_abruzzi_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 216 | 150.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/duca_degli_abruzzi_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 83 | 124.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/duca_degli_abruzzi_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 216 | 232.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/duca_degli_abruzzi_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/duca_degli_abruzzi_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
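Once the raw dataset is extracted, the same `LocalSource` iteration can be used to pick out items carrying a specific tag from the clusters listed below. This is only a sketch: it assumes `item.meta['tags']` behaves like a mapping (or other container) of tag names, which is suggested but not guaranteed by the loading example above, and the chosen tag is just an example.
```python
from waifuc.source import LocalSource

# reuse the directory extracted in the snippet above
source = LocalSource('dataset_dir')

# pick a tag from the cluster tables below; 'white_bikini' is just an example
wanted_tag = 'white_bikini'

selected = []
for item in source:
    # item.meta['tags'] is assumed to be a mapping/iterable of tag names
    if wanted_tag in item.meta['tags']:
        selected.append(item.meta['filename'])

print(f'{len(selected)} image(s) carry the tag {wanted_tag!r}:')
for name in selected:
    print(' -', name)
```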
## List of Clusters
List of tag clustering results; some recurring outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | epaulettes, long_sleeves, looking_at_viewer, rabbit_ears, single_mechanical_arm, 1girl, black_gloves, black_jacket, cleavage, fake_animal_ears, holding, navel, thighs, black_panties, garter_straps, parted_lips, red_nails, stomach, tail, white_background, 2girls, arm_up, black_bowtie, black_footwear, brown_hair, fishnet_thighhighs, frills, full_body, hair_between_eyes, high_heels, highleg, jewelry, kneeling, red_hair, simple_background, skindentation, solo_focus, standing |
| 1 | 5 |  |  |  |  |  | 1girl, bare_shoulders, bowtie, detached_collar, playboy_bunny, rabbit_ears, wrist_cuffs, cleavage, looking_at_viewer, red_nails, side_drill, side_ponytail, solo, white_background, earrings, fake_animal_ears, fishnet_pantyhose, nail_polish, simple_background, sitting, bare_legs, black_bow, black_leotard, brown_hair, drinking_glass, holding_tray, parted_lips, single_elbow_glove, single_glove, skindentation |
| 2 | 15 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, single_mechanical_arm, closed_mouth, official_alternate_costume, prosthetic_arm, smile, bare_shoulders, collarbone, sitting, thighs, black_one-piece_swimsuit, jewelry |
| 3 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blue_sky, navel, outdoors, prosthetic_arm, single_mechanical_arm, thighs, water, white_bikini, collarbone, side-tie_bikini_bottom, cleavage, cloud, day, nail_polish, parted_lips, smile, standing, wet, white_choker, arm_up, red_nails, wading |
| 4 | 9 |  |  |  |  |  | 1girl, black_gloves, italian_flag, pantyhose, red_necktie, red_skirt, single_mechanical_arm, solo, dress, drill_locks, sideboob, single_elbow_glove, standing, brown_hair, green_cape, holding, thigh_strap, prosthetic_arm, simple_background, white_background, drill_hair, looking_at_viewer, armpits, high_heels, medium_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | epaulettes | long_sleeves | looking_at_viewer | rabbit_ears | single_mechanical_arm | 1girl | black_gloves | black_jacket | cleavage | fake_animal_ears | holding | navel | thighs | black_panties | garter_straps | parted_lips | red_nails | stomach | tail | white_background | 2girls | arm_up | black_bowtie | black_footwear | brown_hair | fishnet_thighhighs | frills | full_body | hair_between_eyes | high_heels | highleg | jewelry | kneeling | red_hair | simple_background | skindentation | solo_focus | standing | bare_shoulders | bowtie | detached_collar | playboy_bunny | wrist_cuffs | side_drill | side_ponytail | solo | earrings | fishnet_pantyhose | nail_polish | sitting | bare_legs | black_bow | black_leotard | drinking_glass | holding_tray | single_elbow_glove | single_glove | closed_mouth | official_alternate_costume | prosthetic_arm | smile | collarbone | black_one-piece_swimsuit | blue_sky | outdoors | water | white_bikini | side-tie_bikini_bottom | cloud | day | wet | white_choker | wading | italian_flag | pantyhose | red_necktie | red_skirt | dress | drill_locks | sideboob | green_cape | thigh_strap | drill_hair | armpits | medium_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------|:---------------|:--------------------|:--------------|:------------------------|:--------|:---------------|:---------------|:-----------|:-------------------|:----------|:--------|:---------|:----------------|:----------------|:--------------|:------------|:----------|:-------|:-------------------|:---------|:---------|:---------------|:-----------------|:-------------|:---------------------|:---------|:------------|:--------------------|:-------------|:----------|:----------|:-----------|:-----------|:--------------------|:----------------|:-------------|:-----------|:-----------------|:---------|:------------------|:----------------|:--------------|:-------------|:----------------|:-------|:-----------|:--------------------|:--------------|:----------|:------------|:------------|:----------------|:-----------------|:---------------|:---------------------|:---------------|:---------------|:-----------------------------|:-----------------|:--------|:-------------|:---------------------------|:-----------|:-----------|:--------|:---------------|:-------------------------|:--------|:------|:------|:---------------|:---------|:---------------|:------------|:--------------|:------------|:--------|:--------------|:-----------|:-------------|:--------------|:-------------|:----------|:--------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | | | X | X | | X | | | X | X | | | | | | X | X | | | X | | | | | X | | | | | | | | | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 15 |  |  |  |  |  | | | X | | X | X | | | X | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | X | | | | X | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | | | X | | X | X | | | X | | | X | X | | | X | X | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | X | | | X | | | | | | | | | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | | | X | | X | X | X | | | | X | | | | | | | | | X | | | | | X | | | | | X | | | | | X | | | X | | | | | | | | X | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/duca_degli_abruzzi_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T19:42:38+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T20:12:49+00:00 |
1f7711b82b6ce33a61bbcda1bbd164b0baac3d49 | Tsuinzues/kai | [
"license:openrail",
"region:us"
] | 2024-01-13T19:43:55+00:00 | {"license": "openrail"} | 2024-01-13T19:47:39+00:00 |
|
bc019f2161b17717909c197a65c4f153a81ffaa2 |
# Dataset Card for Evaluation run of FelixChao/WizardDolphin-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/WizardDolphin-7B](https://huggingface.co/FelixChao/WizardDolphin-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__WizardDolphin-7B",
"harness_winogrande_5",
split="train")
```
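
If you only need the aggregated metrics rather than the per-sample details, the `results` configuration listed in this card's metadata can be loaded in the same way. This is a minimal sketch; the variable name is illustrative:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split
# points to the most recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_FelixChao__WizardDolphin-7B",
    "results",
    split="latest",
)

print(results[0])  # aggregated accuracy / stderr values for the latest run
```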
## Latest results
These are the [latest results from run 2024-01-13T19:47:12.026725](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__WizardDolphin-7B/blob/main/results_2024-01-13T19-47-12.026725.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6312916506690491,
"acc_stderr": 0.0324258954325278,
"acc_norm": 0.6317815176886508,
"acc_norm_stderr": 0.03308456506657342,
"mc1": 0.42105263157894735,
"mc1_stderr": 0.017283936248136487,
"mc2": 0.5927990044155668,
"mc2_stderr": 0.01547758043423419
},
"harness|arc:challenge|25": {
"acc": 0.6203071672354948,
"acc_stderr": 0.014182119866974872,
"acc_norm": 0.6467576791808873,
"acc_norm_stderr": 0.013967822714840055
},
"harness|hellaswag|10": {
"acc": 0.6707827126070504,
"acc_stderr": 0.004689685978155169,
"acc_norm": 0.8585939055964947,
"acc_norm_stderr": 0.003477278544493499
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119667,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119667
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.037242495958177295,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.037242495958177295
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924003,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.0245375915728305,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.0245375915728305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.01672268452620014,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.01672268452620014
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980979,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980979
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368985,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368985
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.024685316867257806,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.024685316867257806
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3541899441340782,
"acc_stderr": 0.01599564494729924,
"acc_norm": 0.3541899441340782,
"acc_norm_stderr": 0.01599564494729924
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4530638852672751,
"acc_stderr": 0.012713845972358981,
"acc_norm": 0.4530638852672751,
"acc_norm_stderr": 0.012713845972358981
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304335,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304335
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.02954774168764004,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.02954774168764004
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42105263157894735,
"mc1_stderr": 0.017283936248136487,
"mc2": 0.5927990044155668,
"mc2_stderr": 0.01547758043423419
},
"harness|winogrande|5": {
"acc": 0.7853196527229677,
"acc_stderr": 0.011539912734345391
},
"harness|gsm8k|5": {
"acc": 0.6626231993934799,
"acc_stderr": 0.013023665136222088
}
}
```
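
As a rough illustration of how these numbers can be post-processed, the sketch below averages the `acc_norm` scores over the MMLU (`hendrycksTest`) subtasks; it assumes the dictionary shown above has been bound to a Python variable named `results` (a hypothetical name, not part of the card):

```python
# `results` is assumed to hold the dictionary printed above.
mmlu_scores = [
    task["acc_norm"]
    for name, task in results.items()
    if name.startswith("harness|hendrycksTest-")
]

mmlu_average = sum(mmlu_scores) / len(mmlu_scores)
print(f"Average acc_norm over {len(mmlu_scores)} MMLU subtasks: {mmlu_average:.4f}")
```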
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_FelixChao__WizardDolphin-7B | [
"region:us"
] | 2024-01-13T19:49:31+00:00 | {"pretty_name": "Evaluation run of FelixChao/WizardDolphin-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/WizardDolphin-7B](https://huggingface.co/FelixChao/WizardDolphin-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__WizardDolphin-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T19:47:12.026725](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__WizardDolphin-7B/blob/main/results_2024-01-13T19-47-12.026725.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6312916506690491,\n \"acc_stderr\": 0.0324258954325278,\n \"acc_norm\": 0.6317815176886508,\n \"acc_norm_stderr\": 0.03308456506657342,\n \"mc1\": 0.42105263157894735,\n \"mc1_stderr\": 0.017283936248136487,\n \"mc2\": 0.5927990044155668,\n \"mc2_stderr\": 0.01547758043423419\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6203071672354948,\n \"acc_stderr\": 0.014182119866974872,\n \"acc_norm\": 0.6467576791808873,\n \"acc_norm_stderr\": 0.013967822714840055\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6707827126070504,\n \"acc_stderr\": 0.004689685978155169,\n \"acc_norm\": 0.8585939055964947,\n \"acc_norm_stderr\": 0.003477278544493499\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119667,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119667\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 
0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.037242495958177295,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.037242495958177295\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.03266204299064678,\n \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.03266204299064678\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924003,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924003\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.0245375915728305,\n \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.0245375915728305\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8128440366972477,\n \"acc_stderr\": 0.01672268452620014,\n \"acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.01672268452620014\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415927,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415927\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.03957835471980979,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.03957835471980979\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8212005108556832,\n \"acc_stderr\": 0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.024685316867257806,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.024685316867257806\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3541899441340782,\n \"acc_stderr\": 0.01599564494729924,\n \"acc_norm\": 0.3541899441340782,\n \"acc_norm_stderr\": 0.01599564494729924\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n \"acc_stderr\": 0.012713845972358981,\n \"acc_norm\": 0.4530638852672751,\n \"acc_norm_stderr\": 0.012713845972358981\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304335,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304335\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764004,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764004\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42105263157894735,\n \"mc1_stderr\": 0.017283936248136487,\n \"mc2\": 0.5927990044155668,\n \"mc2_stderr\": 0.01547758043423419\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7853196527229677,\n \"acc_stderr\": 0.011539912734345391\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6626231993934799,\n \"acc_stderr\": 
0.013023665136222088\n }\n}\n```", "repo_url": "https://huggingface.co/FelixChao/WizardDolphin-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|arc:challenge|25_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|gsm8k|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hellaswag|10_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-47-12.026725.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-47-12.026725.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-47-12.026725.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T19-47-12.026725.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-47-12.026725.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T19_47_12.026725", "path": ["**/details_harness|winogrande|5_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T19-47-12.026725.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T19_47_12.026725", "path": ["results_2024-01-13T19-47-12.026725.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T19-47-12.026725.parquet"]}]}]} | 2024-01-13T19:49:52+00:00 |
72966708101cd769e071feb21a8b3063891c9a9f | odunola/testing | [
"region:us"
] | 2024-01-13T19:56:27+00:00 | {"dataset_info": {"features": [{"name": "language", "dtype": {"class_label": {"names": {"0": "en", "1": "de", "2": "fr", "3": "es", "4": "pl", "5": "it", "6": "ro", "7": "hu", "8": "cs", "9": "nl", "10": "fi", "11": "hr", "12": "sk", "13": "sl", "14": "et", "15": "lt", "16": "en_accented"}}}}, {"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "normalized_text", "dtype": "string"}, {"name": "duration", "dtype": "float64"}, {"name": "english_text", "dtype": "string"}, {"name": "english_audio", "dtype": "audio"}], "splits": [{"name": "train", "num_bytes": 12320488.0, "num_examples": 15}], "download_size": 10302008, "dataset_size": 12320488.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-13T20:19:46+00:00 |
|
e09c49fc8ae86dd217fa937797121e37806a9605 | NPHardEval/NPHardEval-results | [
"license:mit",
"region:us"
] | 2024-01-13T19:59:38+00:00 | {"license": "mit"} | 2024-01-19T21:58:03+00:00 |
|
5169be7df0f045b742772f56edb7f8fc5ae78366 | biauser/bonito | [
"license:openrail",
"region:us"
] | 2024-01-13T20:01:16+00:00 | {"license": "openrail"} | 2024-01-13T20:05:50+00:00 |
|
73c67f3d707834cefd475813094387c024b4d8ee | alphalm/gt1_8kElo_all | [
"license:apache-2.0",
"region:us"
] | 2024-01-13T20:02:44+00:00 | {"license": "apache-2.0"} | 2024-01-13T20:05:16+00:00 |
|
ebd3abaf74f0b5f651e7ade146db34dc1aaf1c17 |
# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [EmbeddedLLM/Mistral-7B-Merge-14-v0.5](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.5",
"harness_winogrande_5",
split="train")
```
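
The aggregated metrics live in the `results` configuration, and every configuration also exposes a `latest` split pointing at the newest run. As a minimal sketch (assuming Hub access and an installed `datasets` library), you could list the available configurations and load the latest aggregated results like this:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.5"

# List the available configurations (one per task, plus "results").
configs = get_dataset_config_names(repo)
print(configs[:5])

# Load the aggregated results; the "latest" split always points to the newest run.
results = load_dataset(repo, "results", split="latest")
print(results)
```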
## Latest results
These are the [latest results from run 2024-01-13T20:06:30.676415](https://huggingface.co/datasets/open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.5/blob/main/results_2024-01-13T20-06-30.676415.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6602010739871594,
"acc_stderr": 0.031725005763123176,
"acc_norm": 0.6605298747796313,
"acc_norm_stderr": 0.032373589088471474,
"mc1": 0.4259485924112607,
"mc1_stderr": 0.017310471904076544,
"mc2": 0.5911635736956555,
"mc2_stderr": 0.015563030300185875
},
"harness|arc:challenge|25": {
"acc": 0.6501706484641638,
"acc_stderr": 0.013936809212158289,
"acc_norm": 0.6868600682593856,
"acc_norm_stderr": 0.013552671543623494
},
"harness|hellaswag|10": {
"acc": 0.6836287592113125,
"acc_stderr": 0.0046410920014252925,
"acc_norm": 0.8644692292372037,
"acc_norm_stderr": 0.003415900722381889
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.034765996075164785,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.034765996075164785
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6127659574468085,
"acc_stderr": 0.03184389265339526,
"acc_norm": 0.6127659574468085,
"acc_norm_stderr": 0.03184389265339526
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944427,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944427
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554956,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.02380763319865727,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.02380763319865727
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.029597329730978086,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.029597329730978086
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503228,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503228
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884866,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884866
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.842911877394636,
"acc_stderr": 0.013012459322650717,
"acc_norm": 0.842911877394636,
"acc_norm_stderr": 0.013012459322650717
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3575418994413408,
"acc_stderr": 0.01602939447489489,
"acc_norm": 0.3575418994413408,
"acc_norm_stderr": 0.01602939447489489
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242553,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242553
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.02521804037341063,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.02521804037341063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042117,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042117
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4791395045632334,
"acc_stderr": 0.012759117066518015,
"acc_norm": 0.4791395045632334,
"acc_norm_stderr": 0.012759117066518015
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02767846864214472,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02767846864214472
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.018771683893528183,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.018771683893528183
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061452,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061452
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4259485924112607,
"mc1_stderr": 0.017310471904076544,
"mc2": 0.5911635736956555,
"mc2_stderr": 0.015563030300185875
},
"harness|winogrande|5": {
"acc": 0.8066298342541437,
"acc_stderr": 0.01109979664592053
},
"harness|gsm8k|5": {
"acc": 0.7119029567854435,
"acc_stderr": 0.012474469737197923
}
}
```
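
To work with these numbers programmatically, one option is to parse the results JSON linked above and aggregate the per-task scores yourself. The snippet below is a small sketch under the assumption that the downloaded file contains the task-to-metrics mapping shown above (the exact nesting of the file is an assumption, so the code handles both a flat and a `"results"`-wrapped layout); it averages the normalized accuracy over the MMLU (`hendrycksTest`) subtasks.

```python
import json

# Assumes the linked results JSON has been downloaded to the working directory.
with open("results_2024-01-13T20-06-30.676415.json") as f:
    report = json.load(f)

# Handle either a flat task->metrics dict (as shown above) or one nested under "results".
scores = report.get("results", report)

mmlu = {
    task: metrics["acc_norm"]
    for task, metrics in scores.items()
    if task.startswith("harness|hendrycksTest")
}
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {sum(mmlu.values()) / len(mmlu):.4f}")
```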
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.5 | [
"region:us"
] | 2024-01-13T20:08:58+00:00 | {"pretty_name": "Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.5", "dataset_summary": "Dataset automatically created during the evaluation run of model [EmbeddedLLM/Mistral-7B-Merge-14-v0.5](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.5\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T20:06:30.676415](https://huggingface.co/datasets/open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.5/blob/main/results_2024-01-13T20-06-30.676415.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6602010739871594,\n \"acc_stderr\": 0.031725005763123176,\n \"acc_norm\": 0.6605298747796313,\n \"acc_norm_stderr\": 0.032373589088471474,\n \"mc1\": 0.4259485924112607,\n \"mc1_stderr\": 0.017310471904076544,\n \"mc2\": 0.5911635736956555,\n \"mc2_stderr\": 0.015563030300185875\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6501706484641638,\n \"acc_stderr\": 0.013936809212158289,\n \"acc_norm\": 0.6868600682593856,\n \"acc_norm_stderr\": 0.013552671543623494\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6836287592113125,\n \"acc_stderr\": 0.0046410920014252925,\n \"acc_norm\": 0.8644692292372037,\n \"acc_norm_stderr\": 0.003415900722381889\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.034765996075164785,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.034765996075164785\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339526,\n \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339526\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944427,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944427\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554956,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.02380763319865727,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.02380763319865727\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.029597329730978086,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.029597329730978086\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503228,\n \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503228\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.842911877394636,\n \"acc_stderr\": 0.013012459322650717,\n \"acc_norm\": 0.842911877394636,\n \"acc_norm_stderr\": 0.013012459322650717\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3575418994413408,\n \"acc_stderr\": 0.01602939447489489,\n \"acc_norm\": 0.3575418994413408,\n \"acc_norm_stderr\": 0.01602939447489489\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242553,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242553\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n \"acc_stderr\": 0.02521804037341063,\n \"acc_norm\": 0.729903536977492,\n \"acc_norm_stderr\": 0.02521804037341063\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042117,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042117\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4791395045632334,\n \"acc_stderr\": 0.012759117066518015,\n \"acc_norm\": 0.4791395045632334,\n \"acc_norm_stderr\": 0.012759117066518015\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02767846864214472,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02767846864214472\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.018771683893528183,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.018771683893528183\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061452,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061452\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4259485924112607,\n \"mc1_stderr\": 0.017310471904076544,\n \"mc2\": 0.5911635736956555,\n \"mc2_stderr\": 0.015563030300185875\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8066298342541437,\n \"acc_stderr\": 0.01109979664592053\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7119029567854435,\n \"acc_stderr\": 0.012474469737197923\n }\n}\n```", 
"repo_url": "https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.5", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-06-30.676415.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-06-30.676415.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-06-30.676415.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-06-30.676415.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-06-30.676415.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T20_06_30.676415", "path": ["**/details_harness|winogrande|5_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T20-06-30.676415.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T20_06_30.676415", "path": ["results_2024-01-13T20-06-30.676415.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T20-06-30.676415.parquet"]}]}]} | 2024-01-13T20:09:20+00:00 |
c5c581546c579f6da6cee946175ef75e6ac6b239 |
# Dataset Card for Evaluation run of Josephgflowers/TinyLlama-3T-Cinder-v1.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Josephgflowers/TinyLlama-3T-Cinder-v1.2](https://huggingface.co/Josephgflowers/TinyLlama-3T-Cinder-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Josephgflowers__TinyLlama-3T-Cinder-v1.2",
"harness_winogrande_5",
split="train")
```
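If you only need the aggregated metrics rather than the per-sample details, the "results" configuration can be loaded the same way. The sketch below assumes the "results" config name and "latest" split listed in this card's configuration list:
```python
from datasets import load_dataset

# Aggregated metrics for this run; the "results" config and "latest" split
# are taken from the configuration list further down this card.
results = load_dataset("open-llm-leaderboard/details_Josephgflowers__TinyLlama-3T-Cinder-v1.2",
	"results",
	split="latest")
```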
## Latest results
These are the [latest results from run 2024-01-13T20:09:00.533513](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__TinyLlama-3T-Cinder-v1.2/blob/main/results_2024-01-13T20-09-00.533513.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26585070023086527,
"acc_stderr": 0.03098462149675041,
"acc_norm": 0.26789993726374506,
"acc_norm_stderr": 0.03179290872310728,
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807763,
"mc2": 0.3678388335979262,
"mc2_stderr": 0.01427296099022099
},
"harness|arc:challenge|25": {
"acc": 0.310580204778157,
"acc_stderr": 0.013522292098053059,
"acc_norm": 0.3438566552901024,
"acc_norm_stderr": 0.013880644570156215
},
"harness|hellaswag|10": {
"acc": 0.4314877514439355,
"acc_stderr": 0.004942716091996078,
"acc_norm": 0.5651264688309102,
"acc_norm_stderr": 0.00494727245422622
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174021,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174021
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.03391160934343602,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.03391160934343602
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3125,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641144,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641144
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20851063829787234,
"acc_stderr": 0.026556982117838742,
"acc_norm": 0.20851063829787234,
"acc_norm_stderr": 0.026556982117838742
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.040406101782088394,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.040406101782088394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.14,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.14,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.267741935483871,
"acc_stderr": 0.025189006660212385,
"acc_norm": 0.267741935483871,
"acc_norm_stderr": 0.025189006660212385
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.030315099285617715,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.030315099285617715
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3212121212121212,
"acc_stderr": 0.036462049632538136,
"acc_norm": 0.3212121212121212,
"acc_norm_stderr": 0.036462049632538136
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23737373737373738,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.23737373737373738,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2849740932642487,
"acc_stderr": 0.03257714077709661,
"acc_norm": 0.2849740932642487,
"acc_norm_stderr": 0.03257714077709661
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3384615384615385,
"acc_stderr": 0.023991500500313033,
"acc_norm": 0.3384615384615385,
"acc_norm_stderr": 0.023991500500313033
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.026335739404055803,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.026335739404055803
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19327731092436976,
"acc_stderr": 0.02564947026588919,
"acc_norm": 0.19327731092436976,
"acc_norm_stderr": 0.02564947026588919
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.037579499229433426,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.037579499229433426
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24036697247706423,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.24036697247706423,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.03149328104507957,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.03149328104507957
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.30493273542600896,
"acc_stderr": 0.030898610882477518,
"acc_norm": 0.30493273542600896,
"acc_norm_stderr": 0.030898610882477518
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697624,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697624
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23931623931623933,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.23931623931623933,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2822477650063857,
"acc_stderr": 0.01609530296987857,
"acc_norm": 0.2822477650063857,
"acc_norm_stderr": 0.01609530296987857
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.022894082489925992,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.022894082489925992
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261436,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261436
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2958199356913183,
"acc_stderr": 0.025922371788818798,
"acc_norm": 0.2958199356913183,
"acc_norm_stderr": 0.025922371788818798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2808641975308642,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.2808641975308642,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.02577001564429039,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.02577001564429039
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24445893089960888,
"acc_stderr": 0.010976425013113912,
"acc_norm": 0.24445893089960888,
"acc_norm_stderr": 0.010976425013113912
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3897058823529412,
"acc_stderr": 0.02962466358115969,
"acc_norm": 0.3897058823529412,
"acc_norm_stderr": 0.02962466358115969
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24673202614379086,
"acc_stderr": 0.0174408203674025,
"acc_norm": 0.24673202614379086,
"acc_norm_stderr": 0.0174408203674025
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.040139645540727735,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.040139645540727735
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22448979591836735,
"acc_stderr": 0.026711430555538422,
"acc_norm": 0.22448979591836735,
"acc_norm_stderr": 0.026711430555538422
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2573099415204678,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.2573099415204678,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807763,
"mc2": 0.3678388335979262,
"mc2_stderr": 0.01427296099022099
},
"harness|winogrande|5": {
"acc": 0.5769534333070244,
"acc_stderr": 0.013885055359056472
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225187
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Josephgflowers__TinyLlama-3T-Cinder-v1.2 | [
"region:us"
] | 2024-01-13T20:10:51+00:00 | {"pretty_name": "Evaluation run of Josephgflowers/TinyLlama-3T-Cinder-v1.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [Josephgflowers/TinyLlama-3T-Cinder-v1.2](https://huggingface.co/Josephgflowers/TinyLlama-3T-Cinder-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Josephgflowers__TinyLlama-3T-Cinder-v1.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T20:09:00.533513](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__TinyLlama-3T-Cinder-v1.2/blob/main/results_2024-01-13T20-09-00.533513.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26585070023086527,\n \"acc_stderr\": 0.03098462149675041,\n \"acc_norm\": 0.26789993726374506,\n \"acc_norm_stderr\": 0.03179290872310728,\n \"mc1\": 0.23011015911872704,\n \"mc1_stderr\": 0.014734557959807763,\n \"mc2\": 0.3678388335979262,\n \"mc2_stderr\": 0.01427296099022099\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.310580204778157,\n \"acc_stderr\": 0.013522292098053059,\n \"acc_norm\": 0.3438566552901024,\n \"acc_norm_stderr\": 0.013880644570156215\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4314877514439355,\n \"acc_stderr\": 0.004942716091996078,\n \"acc_norm\": 0.5651264688309102,\n \"acc_norm_stderr\": 0.00494727245422622\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.03391160934343602,\n \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.03391160934343602\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.026556982117838742,\n \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.026556982117838742\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.036951833116502325,\n \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.036951833116502325\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.040406101782088394,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.040406101782088394\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.14,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.267741935483871,\n \"acc_stderr\": 0.025189006660212385,\n \"acc_norm\": 0.267741935483871,\n \"acc_norm_stderr\": 0.025189006660212385\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617715,\n \"acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617715\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.3212121212121212,\n \"acc_stderr\": 0.036462049632538136,\n \"acc_norm\": 0.3212121212121212,\n \"acc_norm_stderr\": 0.036462049632538136\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.23737373737373738,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.2849740932642487,\n \"acc_stderr\": 0.03257714077709661,\n \"acc_norm\": 0.2849740932642487,\n \"acc_norm_stderr\": 0.03257714077709661\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3384615384615385,\n \"acc_stderr\": 0.023991500500313033,\n \"acc_norm\": 0.3384615384615385,\n \"acc_norm_stderr\": 0.023991500500313033\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.026335739404055803,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.026335739404055803\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.19327731092436976,\n \"acc_stderr\": 0.02564947026588919,\n \"acc_norm\": 0.19327731092436976,\n \"acc_norm_stderr\": 0.02564947026588919\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.037579499229433426,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.037579499229433426\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24036697247706423,\n \"acc_stderr\": 0.01832060732096407,\n \"acc_norm\": 0.24036697247706423,\n \"acc_norm_stderr\": 0.01832060732096407\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27941176470588236,\n \"acc_stderr\": 0.03149328104507957,\n \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.03149328104507957\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2911392405063291,\n \"acc_stderr\": 0.02957160106575337,\n \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.02957160106575337\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.30493273542600896,\n \"acc_stderr\": 0.030898610882477518,\n \"acc_norm\": 0.30493273542600896,\n \"acc_norm_stderr\": 0.030898610882477518\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.03547771004159464,\n \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.03547771004159464\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n \"acc_stderr\": 0.04246624336697624,\n \"acc_norm\": 0.2767857142857143,\n \"acc_norm_stderr\": 0.04246624336697624\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23931623931623933,\n \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.23931623931623933,\n \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.2822477650063857,\n \"acc_stderr\": 0.01609530296987857,\n \"acc_norm\": 0.2822477650063857,\n \"acc_norm_stderr\": 0.01609530296987857\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.022894082489925992,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.022894082489925992\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n \"acc_stderr\": 0.014444157808261436,\n \"acc_norm\": 0.24804469273743016,\n \"acc_norm_stderr\": 0.014444157808261436\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2958199356913183,\n \"acc_stderr\": 0.025922371788818798,\n \"acc_norm\": 0.2958199356913183,\n \"acc_norm_stderr\": 0.025922371788818798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2808641975308642,\n \"acc_stderr\": 0.025006469755799208,\n \"acc_norm\": 0.2808641975308642,\n \"acc_norm_stderr\": 0.025006469755799208\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24822695035460993,\n \"acc_stderr\": 0.02577001564429039,\n \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.02577001564429039\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24445893089960888,\n \"acc_stderr\": 0.010976425013113912,\n \"acc_norm\": 0.24445893089960888,\n \"acc_norm_stderr\": 0.010976425013113912\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3897058823529412,\n \"acc_stderr\": 0.02962466358115969,\n \"acc_norm\": 0.3897058823529412,\n \"acc_norm_stderr\": 0.02962466358115969\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24673202614379086,\n \"acc_stderr\": 0.0174408203674025,\n \"acc_norm\": 0.24673202614379086,\n \"acc_norm_stderr\": 0.0174408203674025\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.040139645540727735,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.040139645540727735\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.22448979591836735,\n \"acc_stderr\": 0.026711430555538422,\n \"acc_norm\": 0.22448979591836735,\n \"acc_norm_stderr\": 0.026711430555538422\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2573099415204678,\n \"acc_stderr\": 0.03352799844161865,\n \"acc_norm\": 0.2573099415204678,\n \"acc_norm_stderr\": 0.03352799844161865\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23011015911872704,\n \"mc1_stderr\": 0.014734557959807763,\n \"mc2\": 0.3678388335979262,\n \"mc2_stderr\": 0.01427296099022099\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5769534333070244,\n \"acc_stderr\": 0.013885055359056472\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n 
\"acc_stderr\": 0.0007581501137225187\n }\n}\n```", "repo_url": "https://huggingface.co/Josephgflowers/TinyLlama-3T-Cinder-v1.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-09-00.533513.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-09-00.533513.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-09-00.533513.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-09-00.533513.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-09-00.533513.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T20_09_00.533513", "path": ["**/details_harness|winogrande|5_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T20-09-00.533513.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T20_09_00.533513", "path": ["results_2024-01-13T20-09-00.533513.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T20-09-00.533513.parquet"]}]}]} | 2024-01-13T20:11:12+00:00 |
c37b87507d0b16a59786e36ad71ed72582c432ff | lab42/refcocog-v3 | [
"region:us"
] | 2024-01-13T20:11:57+00:00 | {"dataset_info": {"features": [{"name": "image_0", "dtype": "image"}, {"name": "image_1", "dtype": "image"}, {"name": "image_2", "dtype": "image"}, {"name": "images_rest", "sequence": "image"}, {"name": "mask_0", "dtype": "image"}, {"name": "mask_1", "dtype": "image"}, {"name": "mask_2", "dtype": "image"}, {"name": "masks_rest", "sequence": "image"}, {"name": "conversations", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "dataset", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "n_images", "dtype": "int32"}, {"name": "n_masks", "dtype": "int32"}, {"name": "n_conversations", "dtype": "int32"}], "splits": [{"name": "train", "num_bytes": 12166045998.104, "num_examples": 21899}, {"name": "validation", "num_bytes": 719199051.8, "num_examples": 1300}, {"name": "test", "num_bytes": 1466905798.4, "num_examples": 2600}], "download_size": 4890607002, "dataset_size": 14352150848.303999}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-13T20:17:15+00:00 |
|
c5cfa10a5c05d4a9d90745a3b5e90e7e553c9292 |
# Dataset Card for Evaluation run of NeverSleep/Noromaid-7B-0.4-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NeverSleep/Noromaid-7B-0.4-DPO](https://huggingface.co/NeverSleep/Noromaid-7B-0.4-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
# per-example details for one task configuration (5-shot Winogrande here);
# the other task configurations follow the same naming pattern
data = load_dataset("open-llm-leaderboard/details_NeverSleep__Noromaid-7B-0.4-DPO",
"harness_winogrande_5",
split="train")
```
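To see every available configuration without scrolling through the metadata, the `datasets` library can list them directly. This is a minimal sketch; it only assumes the dataset is publicly readable:

```python
from datasets import get_dataset_config_names

# One configuration per evaluated task, plus the aggregated "results" configuration
configs = get_dataset_config_names("open-llm-leaderboard/details_NeverSleep__Noromaid-7B-0.4-DPO")
print(len(configs), "configurations, e.g.:", configs[:5])
```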
## Latest results
These are the [latest results from run 2024-01-13T20:13:17.595813](https://huggingface.co/datasets/open-llm-leaderboard/details_NeverSleep__Noromaid-7B-0.4-DPO/blob/main/results_2024-01-13T20-13-17.595813.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6276281202486842,
"acc_stderr": 0.032369463494806044,
"acc_norm": 0.6354200747096772,
"acc_norm_stderr": 0.033039898413677445,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024647,
"mc2": 0.4227934173655964,
"mc2_stderr": 0.014275177541071271
},
"harness|arc:challenge|25": {
"acc": 0.591296928327645,
"acc_stderr": 0.014365750345427006,
"acc_norm": 0.6228668941979523,
"acc_norm_stderr": 0.014163366896192603
},
"harness|hellaswag|10": {
"acc": 0.6459868552081258,
"acc_stderr": 0.004772358395130453,
"acc_norm": 0.8431587333200558,
"acc_norm_stderr": 0.0036290784658809666
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.024892469172462843,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.024892469172462843
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.02439667298509476,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.02439667298509476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083015,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083015
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.016722684526200154,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.016722684526200154
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.02616056824660146,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.02616056824660146
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.02336505149175372,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.02336505149175372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7969348659003831,
"acc_stderr": 0.014385525076611576,
"acc_norm": 0.7969348659003831,
"acc_norm_stderr": 0.014385525076611576
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2770949720670391,
"acc_stderr": 0.014968772435812145,
"acc_norm": 0.2770949720670391,
"acc_norm_stderr": 0.014968772435812145
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4530638852672751,
"acc_stderr": 0.01271384597235898,
"acc_norm": 0.4530638852672751,
"acc_norm_stderr": 0.01271384597235898
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506637,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506637
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.03015113445777634,
"acc_norm": 0.9,
"acc_norm_stderr": 0.03015113445777634
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024647,
"mc2": 0.4227934173655964,
"mc2_stderr": 0.014275177541071271
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836675
},
"harness|gsm8k|5": {
"acc": 0.25473843821076575,
"acc_stderr": 0.012001731232879126
}
}
```
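The aggregated numbers above are also stored in the `results` configuration mentioned earlier. Below is a minimal sketch for reading them back programmatically; it assumes, per the configuration listing in this card's metadata, that the newest run is exposed as the `latest` split:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run of this model
results = load_dataset("open-llm-leaderboard/details_NeverSleep__Noromaid-7B-0.4-DPO",
	"results",
	split="latest")

print(results.column_names)  # fields exposed by the aggregated results table
print(results[0])            # first row of the aggregated results table
```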
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_NeverSleep__Noromaid-7B-0.4-DPO | [
"region:us"
] | 2024-01-13T20:12:28+00:00 | {"pretty_name": "Evaluation run of NeverSleep/Noromaid-7B-0.4-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [NeverSleep/Noromaid-7B-0.4-DPO](https://huggingface.co/NeverSleep/Noromaid-7B-0.4-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeverSleep__Noromaid-7B-0.4-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T20:13:17.595813](https://huggingface.co/datasets/open-llm-leaderboard/details_NeverSleep__Noromaid-7B-0.4-DPO/blob/main/results_2024-01-13T20-13-17.595813.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6276281202486842,\n \"acc_stderr\": 0.032369463494806044,\n \"acc_norm\": 0.6354200747096772,\n \"acc_norm_stderr\": 0.033039898413677445,\n \"mc1\": 0.2778457772337821,\n \"mc1_stderr\": 0.015680929364024647,\n \"mc2\": 0.4227934173655964,\n \"mc2_stderr\": 0.014275177541071271\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.591296928327645,\n \"acc_stderr\": 0.014365750345427006,\n \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.014163366896192603\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6459868552081258,\n \"acc_stderr\": 0.004772358395130453,\n \"acc_norm\": 0.8431587333200558,\n \"acc_norm_stderr\": 0.0036290784658809666\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n \"acc_stderr\": 0.024892469172462843,\n \"acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.024892469172462843\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.02439667298509476,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.02439667298509476\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083015,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083015\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200154,\n \"acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200154\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.02616056824660146,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.02616056824660146\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.02336505149175372,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.02336505149175372\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7969348659003831,\n \"acc_stderr\": 0.014385525076611576,\n 
\"acc_norm\": 0.7969348659003831,\n \"acc_norm_stderr\": 0.014385525076611576\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2770949720670391,\n \"acc_stderr\": 0.014968772435812145,\n \"acc_norm\": 0.2770949720670391,\n \"acc_norm_stderr\": 0.014968772435812145\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n \"acc_stderr\": 0.01271384597235898,\n \"acc_norm\": 0.4530638852672751,\n \"acc_norm_stderr\": 0.01271384597235898\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506637,\n \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506637\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.03015113445777634,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.03015113445777634\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n \"mc1_stderr\": 0.015680929364024647,\n \"mc2\": 0.4227934173655964,\n \"mc2_stderr\": 0.014275177541071271\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836675\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.25473843821076575,\n \"acc_stderr\": 0.012001731232879126\n }\n}\n```", "repo_url": 
"https://huggingface.co/NeverSleep/Noromaid-7B-0.4-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-10-12.408839.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-10-12.408839.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-13-17.595813.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-13-17.595813.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-13-17.595813.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-13-17.595813.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-10-12.408839.parquet"]}, 
{"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["**/details_harness|winogrande|5_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": ["**/details_harness|winogrande|5_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T20-13-17.595813.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T20_10_12.408839", "path": ["results_2024-01-13T20-10-12.408839.parquet"]}, {"split": "2024_01_13T20_13_17.595813", "path": 
["results_2024-01-13T20-13-17.595813.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T20-13-17.595813.parquet"]}]}]} | 2024-01-13T20:15:57+00:00 |
8856d77be1fa398343944c577121e474307744bd |
# Dataset Card for Evaluation run of Heng666/EastAsia-4x7B-Moe-experiment
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Heng666/EastAsia-4x7B-Moe-experiment](https://huggingface.co/Heng666/EastAsia-4x7B-Moe-experiment) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Heng666__EastAsia-4x7B-Moe-experiment",
"harness_winogrande_5",
split="train")
```
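The same call works for any of the other configurations listed in this card. As a minimal sketch (assuming the "results" configuration and "latest" split names used throughout this card), the aggregated results can be loaded directly:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics of each run; the "latest" split
# always points to the most recent evaluation recorded in this dataset.
results = load_dataset(
    "open-llm-leaderboard/details_Heng666__EastAsia-4x7B-Moe-experiment",
    "results",
    split="latest",
)
print(results)
```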
## Latest results
These are the [latest results from run 2024-01-13T20:13:26.572648](https://huggingface.co/datasets/open-llm-leaderboard/details_Heng666__EastAsia-4x7B-Moe-experiment/blob/main/results_2024-01-13T20-13-26.572648.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5469974383782618,
"acc_stderr": 0.03393213465072408,
"acc_norm": 0.5579608379948889,
"acc_norm_stderr": 0.03483564278598156,
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236618,
"mc2": 0.4982979181810864,
"mc2_stderr": 0.016572977538918135
},
"harness|arc:challenge|25": {
"acc": 0.36006825938566556,
"acc_stderr": 0.014027516814585188,
"acc_norm": 0.39505119453924914,
"acc_norm_stderr": 0.014285898292938163
},
"harness|hellaswag|10": {
"acc": 0.3889663413662617,
"acc_stderr": 0.004865193237024052,
"acc_norm": 0.4892451702848038,
"acc_norm_stderr": 0.0049886269781730976
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.03765746693865149,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.03765746693865149
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.024278568024307712,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.024278568024307712
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.0267955608481228,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.0267955608481228
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187898,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187898
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.02869787397186067,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.02869787397186067
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5743589743589743,
"acc_stderr": 0.025069094387296532,
"acc_norm": 0.5743589743589743,
"acc_norm_stderr": 0.025069094387296532
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066485,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.03196876989195778,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.03196876989195778
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7669724770642202,
"acc_stderr": 0.0181256691808615,
"acc_norm": 0.7669724770642202,
"acc_norm_stderr": 0.0181256691808615
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653063,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653063
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.03256685484460388,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.03256685484460388
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.04236964753041018,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.04236964753041018
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8034188034188035,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.8034188034188035,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398687,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398687
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.026362437574546545,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.026362437574546545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3407821229050279,
"acc_stderr": 0.01585200244986209,
"acc_norm": 0.3407821229050279,
"acc_norm_stderr": 0.01585200244986209
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424516,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934023,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934023
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284066,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284066
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3983050847457627,
"acc_stderr": 0.01250331056516624,
"acc_norm": 0.3983050847457627,
"acc_norm_stderr": 0.01250331056516624
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5625,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5669934640522876,
"acc_stderr": 0.020045442473324227,
"acc_norm": 0.5669934640522876,
"acc_norm_stderr": 0.020045442473324227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6040816326530613,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.6040816326530613,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.031524391865554016,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.031524391865554016
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236618,
"mc2": 0.4982979181810864,
"mc2_stderr": 0.016572977538918135
},
"harness|winogrande|5": {
"acc": 0.5808997632202052,
"acc_stderr": 0.013867325192210114
},
"harness|gsm8k|5": {
"acc": 0.001516300227445034,
"acc_stderr": 0.001071779348549261
}
}
```
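As a rough illustration (assuming the metrics block above has been saved locally as `latest_results.json`, a hypothetical file name, in exactly the structure shown), the MMLU-style subtask accuracies can be ranked like this:
```python
import json

# Load the metrics dict shown above (the file name is illustrative, not part
# of the dataset itself).
with open("latest_results.json") as f:
    metrics = json.load(f)

# Collect the per-subject MMLU ("hendrycksTest") accuracies and rank them.
mmlu = {
    task: scores["acc"]
    for task, scores in metrics.items()
    if task.startswith("harness|hendrycksTest-")
}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{task}: {acc:.3f}")
```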
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Heng666__EastAsia-4x7B-Moe-experiment | [
"region:us"
] | 2024-01-13T20:15:42+00:00 | {"pretty_name": "Evaluation run of Heng666/EastAsia-4x7B-Moe-experiment", "dataset_summary": "Dataset automatically created during the evaluation run of model [Heng666/EastAsia-4x7B-Moe-experiment](https://huggingface.co/Heng666/EastAsia-4x7B-Moe-experiment) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Heng666__EastAsia-4x7B-Moe-experiment\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T20:13:26.572648](https://huggingface.co/datasets/open-llm-leaderboard/details_Heng666__EastAsia-4x7B-Moe-experiment/blob/main/results_2024-01-13T20-13-26.572648.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5469974383782618,\n \"acc_stderr\": 0.03393213465072408,\n \"acc_norm\": 0.5579608379948889,\n \"acc_norm_stderr\": 0.03483564278598156,\n \"mc1\": 0.2937576499388005,\n \"mc1_stderr\": 0.015945068581236618,\n \"mc2\": 0.4982979181810864,\n \"mc2_stderr\": 0.016572977538918135\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.36006825938566556,\n \"acc_stderr\": 0.014027516814585188,\n \"acc_norm\": 0.39505119453924914,\n \"acc_norm_stderr\": 0.014285898292938163\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3889663413662617,\n \"acc_stderr\": 0.004865193237024052,\n \"acc_norm\": 0.4892451702848038,\n \"acc_norm_stderr\": 0.0049886269781730976\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n 
\"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.03765746693865149,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.03765746693865149\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.032469569197899575,\n \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.032469569197899575\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.024278568024307712,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.024278568024307712\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.667741935483871,\n \"acc_stderr\": 0.0267955608481228,\n \"acc_norm\": 0.667741935483871,\n \"acc_norm_stderr\": 0.0267955608481228\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162933,\n \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162933\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187898,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187898\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.02869787397186067,\n \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.02869787397186067\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.025069094387296532,\n \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.025069094387296532\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.03196876989195778,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03196876989195778\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7669724770642202,\n \"acc_stderr\": 0.0181256691808615,\n \"acc_norm\": 0.7669724770642202,\n \"acc_norm_stderr\": 0.0181256691808615\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653063,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653063\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.03256685484460388,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.03256685484460388\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041018,\n \"acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041018\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.8034188034188035,\n \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7611749680715197,\n \"acc_stderr\": 0.015246803197398687,\n \"acc_norm\": 0.7611749680715197,\n \"acc_norm_stderr\": 0.015246803197398687\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.026362437574546545,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.026362437574546545\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3407821229050279,\n \"acc_stderr\": 0.01585200244986209,\n \"acc_norm\": 0.3407821229050279,\n \"acc_norm_stderr\": 0.01585200244986209\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424516,\n \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424516\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n \"acc_stderr\": 0.027210420375934023,\n \"acc_norm\": 0.6430868167202572,\n \"acc_norm_stderr\": 0.027210420375934023\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507894,\n \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507894\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284066,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284066\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3983050847457627,\n \"acc_stderr\": 0.01250331056516624,\n \"acc_norm\": 0.3983050847457627,\n \"acc_norm_stderr\": 0.01250331056516624\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324227,\n \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324227\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6040816326530613,\n \"acc_stderr\": 0.03130802899065686,\n \"acc_norm\": 0.6040816326530613,\n \"acc_norm_stderr\": 0.03130802899065686\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n \"acc_stderr\": 0.031524391865554016,\n \"acc_norm\": 0.7263681592039801,\n \"acc_norm_stderr\": 0.031524391865554016\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2937576499388005,\n \"mc1_stderr\": 0.015945068581236618,\n \"mc2\": 0.4982979181810864,\n \"mc2_stderr\": 0.016572977538918135\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5808997632202052,\n \"acc_stderr\": 0.013867325192210114\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \"acc_stderr\": 0.001071779348549261\n }\n}\n```", 
"repo_url": "https://huggingface.co/Heng666/EastAsia-4x7B-Moe-experiment", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-13-26.572648.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-13-26.572648.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-13-26.572648.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-13-26.572648.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-13-26.572648.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T20_13_26.572648", "path": ["**/details_harness|winogrande|5_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T20-13-26.572648.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T20_13_26.572648", "path": ["results_2024-01-13T20-13-26.572648.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T20-13-26.572648.parquet"]}]}]} | 2024-01-13T20:16:08+00:00 |
b700d93c3cf75e58986891b3f95f3712b69417df |
# Dataset of matsukaze/松風/松风 (Azur Lane)
This is the dataset of matsukaze/松風/松风 (Azur Lane), containing 21 images and their tags.
The core tags of this character are `animal_ears, yellow_eyes, black_hair, fox_ears, long_hair, brown_hair, ponytail, tail, multicolored_hair, hair_between_eyes, hair_ornament, bangs, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 21 | 20.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsukaze_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 21 | 14.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsukaze_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 47 | 28.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsukaze_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 21 | 18.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsukaze_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 47 | 35.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsukaze_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/matsukaze_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
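If you prefer to work with the pre-resized IMG+TXT packages instead of the raw archive, the sketch below downloads `dataset-800.zip` and pairs each image with its tag file. It assumes the usual layout of these packages, i.e. every image is accompanied by a same-named `.txt` file containing its comma-separated tags.
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/matsukaze_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its tag file (assumed layout: foo.png + foo.txt)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    tag_path = os.path.join(dataset_dir, stem + '.txt')
    tags = ''
    if os.path.exists(tag_path):
        with open(tag_path, 'r', encoding='utf-8') as f:
            tags = f.read().strip()
    print(name, '->', tags)
```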
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------|
| 0 | 21 |  |  |  |  |  | 1girl, open_mouth, looking_at_viewer, solo, hakama_skirt, wide_sleeves, long_sleeves, simple_background, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | open_mouth | looking_at_viewer | solo | hakama_skirt | wide_sleeves | long_sleeves | simple_background | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------------------|:-------|:---------------|:---------------|:---------------|:--------------------|:--------|
| 0 | 21 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
| CyberHarem/matsukaze_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T20:21:11+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T20:32:52+00:00 |
008b95c605a3c542bc740e155e6b3f9263be16be |
# Dataset of leander/リアンダー/利安得 (Azur Lane)
This is the dataset of leander/リアンダー/利安得 (Azur Lane), containing 91 images and their tags.
The core tags of this character are `blonde_hair, blue_eyes, long_hair, breasts, hairband, large_breasts, bangs, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 91 | 124.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leander_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 91 | 69.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leander_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 223 | 151.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leander_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 91 | 110.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leander_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 223 | 218.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leander_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/leander_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, looking_at_viewer, solo, retrofit_(azur_lane), red_skirt, white_gloves, fingerless_gloves, smile, garter_straps, black_thighhighs, blush, pleated_skirt, short_sleeves, white_shirt, medium_breasts, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | retrofit_(azur_lane) | red_skirt | white_gloves | fingerless_gloves | smile | garter_straps | black_thighhighs | blush | pleated_skirt | short_sleeves | white_shirt | medium_breasts | open_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------------------|:------------|:---------------|:--------------------|:--------|:----------------|:-------------------|:--------|:----------------|:----------------|:--------------|:-----------------|:-------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/leander_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T20:21:19+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T20:48:27+00:00 |
2d05176676d3b42019c775b39cf37a81d8d360b9 |
# Dataset of manchester/マンチェスター/曼彻斯特 (Azur Lane)
This is the dataset of manchester/マンチェスター/曼彻斯特 (Azur Lane), containing 29 images and their tags.
The core tags of this character are `breasts, bangs, large_breasts, grey_hair, green_eyes, hair_bun, maid_headdress, short_hair, hat, nurse_cap, symbol-shaped_pupils`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 29 | 58.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/manchester_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 29 | 25.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/manchester_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 74 | 60.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/manchester_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 29 | 47.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/manchester_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 74 | 98.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/manchester_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/manchester_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, looking_at_viewer, navel, solo, black_bikini, blush, smile, collarbone, cleavage, open_mouth, twintails, aqua_eyes, maid_bikini, bare_shoulders, black_choker, frilled_bikini, outdoors, sitting, twin_braids, wrist_cuffs, bridal_garter, closed_mouth, nipples, side-tie_bikini_bottom, x_hair_ornament |
| 1 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_gloves, blush, nurse, short_sleeves, shrug_(clothing), white_thighhighs, bra, demon_wings, holding_syringe, demon_tail, heart-shaped_pupils, navel, open_mouth, simple_background, sitting, smile, white_skirt, garter_straps, medium_breasts, single_hair_bun, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | navel | solo | black_bikini | blush | smile | collarbone | cleavage | open_mouth | twintails | aqua_eyes | maid_bikini | bare_shoulders | black_choker | frilled_bikini | outdoors | sitting | twin_braids | wrist_cuffs | bridal_garter | closed_mouth | nipples | side-tie_bikini_bottom | x_hair_ornament | white_gloves | nurse | short_sleeves | shrug_(clothing) | white_thighhighs | bra | demon_wings | holding_syringe | demon_tail | heart-shaped_pupils | simple_background | white_skirt | garter_straps | medium_breasts | single_hair_bun | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:---------------|:--------|:--------|:-------------|:-----------|:-------------|:------------|:------------|:--------------|:-----------------|:---------------|:-----------------|:-----------|:----------|:--------------|:--------------|:----------------|:---------------|:----------|:-------------------------|:------------------|:---------------|:--------|:----------------|:-------------------|:-------------------|:------|:--------------|:------------------|:-------------|:----------------------|:--------------------|:--------------|:----------------|:-----------------|:------------------|:-------------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | | X | X | | | X | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/manchester_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T20:21:19+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T20:34:48+00:00 |
dd81daadb9889fe05581e57dc9b743d8524c0640 |
# Dataset of trento/トレント/特伦托 (Azur Lane)
This is the dataset of trento/トレント/特伦托 (Azur Lane), containing 60 images and their tags.
The core tags of this character are `long_hair, breasts, hair_over_one_eye, large_breasts, purple_hair, red_eyes, bangs, very_long_hair, eyewear_on_head, sunglasses, blue_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 60 | 87.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trento_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 60 | 47.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trento_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 145 | 106.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trento_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 60 | 76.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trento_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 145 | 151.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trento_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/trento_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, black_bikini, cleavage, navel, solo, blush, looking_at_viewer, o-ring_bikini, bare_shoulders, smile, thigh_strap, collarbone, wrist_scrunchie, black_choker, thighs, bead_bracelet, open_mouth, simple_background, stomach, official_alternate_costume, side-tie_bikini_bottom, closed_mouth, multi-strapped_bikini, o-ring_top, mole, thigh_gap, wet, white_background |
| 1 | 5 |  |  |  |  |  | black_bikini, blue_sky, day, looking_at_viewer, navel, official_alternate_costume, open_mouth, 1girl, cleavage, cowboy_shot, multi-strapped_bikini, o-ring_bikini, outdoors, solo, :d, cloud, collarbone, side-tie_bikini_bottom, black_choker, bracelet, halterneck, ocean, skindentation, standing, thigh_strap |
| 2 | 12 |  |  |  |  |  | looking_at_viewer, 1girl, solo, white_gloves, cape, garter_straps, simple_background, smile, epaulettes, white_background, blush, dress, standing, black_thighhighs, boots, cleavage, red_necktie |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_bikini | cleavage | navel | solo | blush | looking_at_viewer | o-ring_bikini | bare_shoulders | smile | thigh_strap | collarbone | wrist_scrunchie | black_choker | thighs | bead_bracelet | open_mouth | simple_background | stomach | official_alternate_costume | side-tie_bikini_bottom | closed_mouth | multi-strapped_bikini | o-ring_top | mole | thigh_gap | wet | white_background | blue_sky | day | cowboy_shot | outdoors | :d | cloud | bracelet | halterneck | ocean | skindentation | standing | white_gloves | cape | garter_straps | epaulettes | dress | black_thighhighs | boots | red_necktie |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------|:--------|:-------|:--------|:--------------------|:----------------|:-----------------|:--------|:--------------|:-------------|:------------------|:---------------|:---------|:----------------|:-------------|:--------------------|:----------|:-----------------------------|:-------------------------|:---------------|:------------------------|:-------------|:-------|:------------|:------|:-------------------|:-----------|:------|:--------------|:-----------|:-----|:--------|:-----------|:-------------|:--------|:----------------|:-----------|:---------------|:-------|:----------------|:-------------|:--------|:-------------------|:--------|:--------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | X | | | X | X | | X | | | X | | | X | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | | X | | X | X | X | | | X | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
| CyberHarem/trento_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T20:21:21+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T20:39:44+00:00 |
7d91a2257dcfbbbed21afc218fc58a732e4cd204 |
# Dataset of kent/ケント/肯特 (Azur Lane)
This is the dataset of kent/ケント/肯特 (Azur Lane), containing 24 images and their tags.
The core tags of this character are `breasts, red_eyes, hairband, hair_between_eyes, large_breasts, short_hair, fang, purple_hair, bangs, ahoge, bow, ribbon, black_hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 24 | 29.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kent_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 16.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kent_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 61 | 38.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kent_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 25.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kent_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 61 | 56.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kent_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kent_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters, as the tag-frequency sketch below illustrates.
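A rough way to reproduce this kind of clustering is to count tag frequencies over the raw pack. A minimal sketch, assuming the raw pack has already been extracted to `dataset_dir` as in the snippet above, and that `item.meta['tags']` holds either a list of tags or a tag-to-score mapping:

```python
from collections import Counter

from waifuc.source import LocalSource

counter = Counter()
for item in LocalSource('dataset_dir'):
    tags = item.meta.get('tags', {})
    # tags may be a mapping (tag -> score) or a plain list, depending on the pack
    names = tags.keys() if isinstance(tags, dict) else tags
    counter.update(names)

# show the 20 most frequent tags; recurring outfit tags hint at clusters
for tag, count in counter.most_common(20):
    print(f'{count:4d}  {tag}')
```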
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, solo, looking_at_viewer, open_mouth, blush, bare_shoulders, black_gloves, elbow_gloves, simple_background, sleeveless_shirt, upper_body, white_background, white_shirt, :d, skirt, white_apron |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | open_mouth | blush | bare_shoulders | black_gloves | elbow_gloves | simple_background | sleeveless_shirt | upper_body | white_background | white_shirt | :d | skirt | white_apron |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-------------|:--------|:-----------------|:---------------|:---------------|:--------------------|:-------------------|:-------------|:-------------------|:--------------|:-----|:--------|:--------------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/kent_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T20:21:37+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T20:32:39+00:00 |
e6569ed2960624988cfb36d9025040f7fc0be7c1 |
# Dataset Card for Evaluation run of charlesdedampierre/TopicNeuralHermes-2.5-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [charlesdedampierre/TopicNeuralHermes-2.5-Mistral-7B](https://huggingface.co/charlesdedampierre/TopicNeuralHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_charlesdedampierre__TopicNeuralHermes-2.5-Mistral-7B",
"harness_winogrande_5",
split="train")
```
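To pull just the aggregated numbers instead of per-example details, the "results" configuration mentioned above can be loaded in the same way. A sketch, assuming the config and split names follow the convention used by these leaderboard detail repos:

```python
from datasets import load_dataset

# aggregated metrics for the latest run (config/split names assumed from the repo layout)
results = load_dataset("open-llm-leaderboard/details_charlesdedampierre__TopicNeuralHermes-2.5-Mistral-7B",
	"results",
	split="latest")
print(results[0])
```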
## Latest results
These are the [latest results from run 2024-01-13T20:19:31.036723](https://huggingface.co/datasets/open-llm-leaderboard/details_charlesdedampierre__TopicNeuralHermes-2.5-Mistral-7B/blob/main/results_2024-01-13T20-19-31.036723.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6377095946098038,
"acc_stderr": 0.03228135828297783,
"acc_norm": 0.64089103621422,
"acc_norm_stderr": 0.032919379883826004,
"mc1": 0.37454100367197063,
"mc1_stderr": 0.016943535128405338,
"mc2": 0.5546784964324225,
"mc2_stderr": 0.015236087364473834
},
"harness|arc:challenge|25": {
"acc": 0.6254266211604096,
"acc_stderr": 0.014144193471893456,
"acc_norm": 0.6706484641638225,
"acc_norm_stderr": 0.013734057652635474
},
"harness|hellaswag|10": {
"acc": 0.6623182632941645,
"acc_stderr": 0.004719529099913136,
"acc_norm": 0.8544114718183629,
"acc_norm_stderr": 0.003519724163310887
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901409,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901409
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.02854479331905533,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.02854479331905533
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721164,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721164
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130956,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130956
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886797,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886797
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530333,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530333
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573504,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573504
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662264,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662264
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3139664804469274,
"acc_stderr": 0.015521923933523628,
"acc_norm": 0.3139664804469274,
"acc_norm_stderr": 0.015521923933523628
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959607,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959607
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162666,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162666
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801302,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37454100367197063,
"mc1_stderr": 0.016943535128405338,
"mc2": 0.5546784964324225,
"mc2_stderr": 0.015236087364473834
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.01158587171020941
},
"harness|gsm8k|5": {
"acc": 0.5420773313115997,
"acc_stderr": 0.013723629649844079
}
}
```
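The same aggregated block can also be fetched directly from the raw JSON file linked above. A minimal sketch, assuming (as in other leaderboard detail repos) that the aggregated metrics sit under the top-level `results` key of that file:

```python
import json

from huggingface_hub import hf_hub_download

# the exact filename comes from the "latest results" link above
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_charlesdedampierre__TopicNeuralHermes-2.5-Mistral-7B",
    repo_type="dataset",
    filename="results_2024-01-13T20-19-31.036723.json",
)
with open(path, "r", encoding="utf-8") as f:
    data = json.load(f)

# "all" aggregates accuracy across tasks; per-task blocks sit alongside it
# (the top-level "results" key is an assumption based on similar detail repos)
print(json.dumps(data["results"]["all"], indent=2))
```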
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_charlesdedampierre__TopicNeuralHermes-2.5-Mistral-7B | [
"region:us"
] | 2024-01-13T20:21:50+00:00 | {"pretty_name": "Evaluation run of charlesdedampierre/TopicNeuralHermes-2.5-Mistral-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [charlesdedampierre/TopicNeuralHermes-2.5-Mistral-7B](https://huggingface.co/charlesdedampierre/TopicNeuralHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_charlesdedampierre__TopicNeuralHermes-2.5-Mistral-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T20:19:31.036723](https://huggingface.co/datasets/open-llm-leaderboard/details_charlesdedampierre__TopicNeuralHermes-2.5-Mistral-7B/blob/main/results_2024-01-13T20-19-31.036723.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6377095946098038,\n \"acc_stderr\": 0.03228135828297783,\n \"acc_norm\": 0.64089103621422,\n \"acc_norm_stderr\": 0.032919379883826004,\n \"mc1\": 0.37454100367197063,\n \"mc1_stderr\": 0.016943535128405338,\n \"mc2\": 0.5546784964324225,\n \"mc2_stderr\": 0.015236087364473834\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6254266211604096,\n \"acc_stderr\": 0.014144193471893456,\n \"acc_norm\": 0.6706484641638225,\n \"acc_norm_stderr\": 0.013734057652635474\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6623182632941645,\n \"acc_stderr\": 0.004719529099913136,\n \"acc_norm\": 0.8544114718183629,\n \"acc_norm_stderr\": 0.003519724163310887\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901409,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901409\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.02854479331905533,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.02854479331905533\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 
0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721164,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721164\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130956,\n \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130956\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886797,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886797\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530333,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530333\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573504,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573504\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n 
\"acc_stderr\": 0.013547415658662264,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.013547415658662264\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3139664804469274,\n \"acc_stderr\": 0.015521923933523628,\n \"acc_norm\": 0.3139664804469274,\n \"acc_norm_stderr\": 0.015521923933523628\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162666,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162666\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37454100367197063,\n \"mc1_stderr\": 0.016943535128405338,\n \"mc2\": 0.5546784964324225,\n \"mc2_stderr\": 0.015236087364473834\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.01158587171020941\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5420773313115997,\n \"acc_stderr\": 0.013723629649844079\n }\n}\n```", 
"repo_url": "https://huggingface.co/charlesdedampierre/TopicNeuralHermes-2.5-Mistral-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-19-31.036723.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-19-31.036723.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-19-31.036723.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-19-31.036723.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-19-31.036723.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T20_19_31.036723", "path": ["**/details_harness|winogrande|5_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T20-19-31.036723.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T20_19_31.036723", "path": ["results_2024-01-13T20-19-31.036723.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T20-19-31.036723.parquet"]}]}]} | 2024-01-13T20:22:11+00:00 |
5744b5a3ae05159b541531e6b2f949a92490b8bf |
# Dataset of asanagi/朝凪/朝凪 (Azur Lane)
This is the dataset of asanagi/朝凪/朝凪 (Azur Lane), containing 34 images and their tags.
The core tags of this character are `animal_ears, long_hair, bangs, animal_ear_fluff, yellow_eyes, fox_ears, twintails, blunt_bangs, breasts, grey_hair, fox_girl, tail, braid, very_long_hair, fox_tail, small_breasts, bow, white_hair, hair_bow, red_bow, fang`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), and the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 34 | 58.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asanagi_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 34 | 25.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asanagi_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 88 | 60.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asanagi_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 34 | 47.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asanagi_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 88 | 94.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asanagi_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/asanagi_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
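
The pre-processed IMG+TXT packages listed above can also be used without waifuc. The sketch below is a minimal example under the assumption that each archive stores one `.txt` tag file next to each image with the same stem (the `dataset-800.zip` variant is used for illustration; the pairing layout is assumed, not verified against the archive contents):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download one of the pre-processed IMG+TXT packages (800px variant)
zip_file = hf_hub_download(
    repo_id='CyberHarem/asanagi_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its same-named tag file (assumed IMG+TXT layout)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        tags = open(txt_path, encoding='utf-8').read().strip() if os.path.exists(txt_path) else ''
        print(name, tags)
```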
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, white_thighhighs, solo, white_bikini, open_mouth, blush, micro_bikini, navel, smile, collarbone, side-tie_bikini_bottom, armpits, day, skindentation, sky, toeless_legwear |
| 1 | 15 |  |  |  |  |  | 1girl, solo, looking_at_viewer, detached_sleeves, black_thighhighs, open_mouth, black_gloves, blush, fingerless_gloves, simple_background, leotard, wide_sleeves, :d, japanese_clothes, navel, sword, white_background, bare_shoulders, clothing_cutout, fangs, ribbon_trim, skirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | white_thighhighs | solo | white_bikini | open_mouth | blush | micro_bikini | navel | smile | collarbone | side-tie_bikini_bottom | armpits | day | skindentation | sky | toeless_legwear | detached_sleeves | black_thighhighs | black_gloves | fingerless_gloves | simple_background | leotard | wide_sleeves | :d | japanese_clothes | sword | white_background | bare_shoulders | clothing_cutout | fangs | ribbon_trim | skirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------------------|:-------|:---------------|:-------------|:--------|:---------------|:--------|:--------|:-------------|:-------------------------|:----------|:------|:----------------|:------|:------------------|:-------------------|:-------------------|:---------------|:--------------------|:--------------------|:----------|:---------------|:-----|:-------------------|:--------|:-------------------|:-----------------|:------------------|:--------|:--------------|:--------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | X | | X | | X | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/asanagi_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T20:21:56+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T20:29:33+00:00 |
534e250d473e65f69deb1a71d1eaf357a81e745c | lab42/refclef-v3 | [
"region:us"
] | 2024-01-13T20:22:59+00:00 | {"dataset_info": {"features": [{"name": "image_0", "dtype": "image"}, {"name": "image_1", "dtype": "image"}, {"name": "image_2", "dtype": "image"}, {"name": "images_rest", "sequence": "image"}, {"name": "mask_0", "dtype": "image"}, {"name": "mask_1", "dtype": "image"}, {"name": "mask_2", "dtype": "image"}, {"name": "masks_rest", "sequence": "image"}, {"name": "conversations", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "dataset", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "n_images", "dtype": "int32"}, {"name": "n_masks", "dtype": "int32"}, {"name": "n_conversations", "dtype": "int32"}], "splits": [{"name": "train", "num_bytes": 7342209198.568, "num_examples": 17978}, {"name": "validation", "num_bytes": 898268528.0, "num_examples": 2000}, {"name": "test", "num_bytes": 327013573.112, "num_examples": 1344}], "download_size": 2323875072, "dataset_size": 8567491299.68}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-13T20:26:52+00:00 |
|
9b230530b84c0188ecca6e31c971c828de4e00da |
# Dataset Card for Evaluation run of diffnamehard/Psyfighter2-Noromaid-ties-Capybara-13B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [diffnamehard/Psyfighter2-Noromaid-ties-Capybara-13B](https://huggingface.co/diffnamehard/Psyfighter2-Noromaid-ties-Capybara-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_diffnamehard__Psyfighter2-Noromaid-ties-Capybara-13B",
"harness_winogrande_5",
split="train")
```
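
If you want to see which of the 63 per-task configurations are available before picking one, the `datasets` library can enumerate them; a minimal sketch (any configuration name it returns can be passed to `load_dataset` as above):

```python
from datasets import get_dataset_config_names

# list every per-task configuration available in this details repository
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_diffnamehard__Psyfighter2-Noromaid-ties-Capybara-13B"
)
print(len(configs), configs[:5])
```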
## Latest results
These are the [latest results from run 2024-01-13T20:20:55.847857](https://huggingface.co/datasets/open-llm-leaderboard/details_diffnamehard__Psyfighter2-Noromaid-ties-Capybara-13B/blob/main/results_2024-01-13T20-20-55.847857.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5664354333372791,
"acc_stderr": 0.033523024960411534,
"acc_norm": 0.5714502762424973,
"acc_norm_stderr": 0.034221321166461816,
"mc1": 0.3463892288861689,
"mc1_stderr": 0.016656997109125136,
"mc2": 0.5143942772336377,
"mc2_stderr": 0.015015865193028501
},
"harness|arc:challenge|25": {
"acc": 0.5861774744027304,
"acc_stderr": 0.014392730009221009,
"acc_norm": 0.6228668941979523,
"acc_norm_stderr": 0.014163366896192598
},
"harness|hellaswag|10": {
"acc": 0.638020314678351,
"acc_stderr": 0.004795908282584543,
"acc_norm": 0.8386775542720574,
"acc_norm_stderr": 0.003670763673792967
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.040335656678483205,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.040335656678483205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955785,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.032579014820998356,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.032579014820998356
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.024130158299762613,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.024130158299762613
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.02672949906834996,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.02672949906834996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03481904844438804,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03481904844438804
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7171717171717171,
"acc_stderr": 0.03208779558786752,
"acc_norm": 0.7171717171717171,
"acc_norm_stderr": 0.03208779558786752
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624526,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624526
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5307692307692308,
"acc_stderr": 0.025302958890850154,
"acc_norm": 0.5307692307692308,
"acc_norm_stderr": 0.025302958890850154
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7376146788990826,
"acc_stderr": 0.018861885021534734,
"acc_norm": 0.7376146788990826,
"acc_norm_stderr": 0.018861885021534734
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696042,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696042
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070415,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070415
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.02514093595033544,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.02514093595033544
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7484035759897829,
"acc_stderr": 0.015517322365529638,
"acc_norm": 0.7484035759897829,
"acc_norm_stderr": 0.015517322365529638
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.025950054337654075,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.025950054337654075
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46256983240223465,
"acc_stderr": 0.016675578687308082,
"acc_norm": 0.46256983240223465,
"acc_norm_stderr": 0.016675578687308082
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.027530078447110307,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.027530078447110307
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.02709865262130175,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.02709865262130175
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6234567901234568,
"acc_stderr": 0.02695934451874778,
"acc_norm": 0.6234567901234568,
"acc_norm_stderr": 0.02695934451874778
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4406779661016949,
"acc_stderr": 0.012680037994097074,
"acc_norm": 0.4406779661016949,
"acc_norm_stderr": 0.012680037994097074
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.03029061918048569,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.03029061918048569
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5669934640522876,
"acc_stderr": 0.020045442473324224,
"acc_norm": 0.5669934640522876,
"acc_norm_stderr": 0.020045442473324224
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425464,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425464
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772436,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772436
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3463892288861689,
"mc1_stderr": 0.016656997109125136,
"mc2": 0.5143942772336377,
"mc2_stderr": 0.015015865193028501
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838234
},
"harness|gsm8k|5": {
"acc": 0.30401819560272936,
"acc_stderr": 0.012670420440198662
}
}
```
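
For quick side-by-side comparisons of the per-task scores above, the nested JSON can be flattened into a table; the following is a minimal sketch assuming the results shown here have been saved locally as `results.json` (the filename is illustrative):

```python
import json

import pandas as pd

# load the per-task metrics shown above (local path is illustrative)
with open("results.json", encoding="utf-8") as f:
    results = json.load(f)

# one row per benchmark; tasks without an "acc" field (e.g. truthfulqa mc1/mc2) become NaN
df = pd.DataFrame.from_dict(results, orient="index")
print(df[["acc", "acc_stderr"]].sort_values("acc", ascending=False).head(10))
```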
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_diffnamehard__Psyfighter2-Noromaid-ties-Capybara-13B | [
"region:us"
] | 2024-01-13T20:23:14+00:00 | {"pretty_name": "Evaluation run of diffnamehard/Psyfighter2-Noromaid-ties-Capybara-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [diffnamehard/Psyfighter2-Noromaid-ties-Capybara-13B](https://huggingface.co/diffnamehard/Psyfighter2-Noromaid-ties-Capybara-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_diffnamehard__Psyfighter2-Noromaid-ties-Capybara-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T20:20:55.847857](https://huggingface.co/datasets/open-llm-leaderboard/details_diffnamehard__Psyfighter2-Noromaid-ties-Capybara-13B/blob/main/results_2024-01-13T20-20-55.847857.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5664354333372791,\n \"acc_stderr\": 0.033523024960411534,\n \"acc_norm\": 0.5714502762424973,\n \"acc_norm_stderr\": 0.034221321166461816,\n \"mc1\": 0.3463892288861689,\n \"mc1_stderr\": 0.016656997109125136,\n \"mc2\": 0.5143942772336377,\n \"mc2_stderr\": 0.015015865193028501\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5861774744027304,\n \"acc_stderr\": 0.014392730009221009,\n \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.014163366896192598\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.638020314678351,\n \"acc_stderr\": 0.004795908282584543,\n \"acc_norm\": 0.8386775542720574,\n \"acc_norm_stderr\": 0.003670763673792967\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483205,\n \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483205\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 
0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.032579014820998356,\n \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.032579014820998356\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.024130158299762613,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.024130158299762613\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n \"acc_stderr\": 0.02672949906834996,\n \"acc_norm\": 0.6709677419354839,\n \"acc_norm_stderr\": 0.02672949906834996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438804,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438804\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7171717171717171,\n \"acc_stderr\": 0.03208779558786752,\n \"acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.03208779558786752\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624526,\n \"acc_norm\": 
0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624526\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5307692307692308,\n \"acc_stderr\": 0.025302958890850154,\n \"acc_norm\": 0.5307692307692308,\n \"acc_norm_stderr\": 0.025302958890850154\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7376146788990826,\n \"acc_stderr\": 0.018861885021534734,\n \"acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.018861885021534734\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696042,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696042\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070415,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070415\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n \"acc_stderr\": 0.02514093595033544,\n \"acc_norm\": 0.8205128205128205,\n \"acc_norm_stderr\": 0.02514093595033544\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.7484035759897829,\n \"acc_stderr\": 0.015517322365529638,\n \"acc_norm\": 0.7484035759897829,\n \"acc_norm_stderr\": 0.015517322365529638\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.025950054337654075,\n \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.025950054337654075\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46256983240223465,\n \"acc_stderr\": 0.016675578687308082,\n \"acc_norm\": 0.46256983240223465,\n \"acc_norm_stderr\": 0.016675578687308082\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.027530078447110307,\n \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.027530078447110307\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n \"acc_stderr\": 0.02709865262130175,\n \"acc_norm\": 0.6495176848874598,\n \"acc_norm_stderr\": 0.02709865262130175\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6234567901234568,\n \"acc_stderr\": 0.02695934451874778,\n \"acc_norm\": 0.6234567901234568,\n \"acc_norm_stderr\": 0.02695934451874778\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4406779661016949,\n \"acc_stderr\": 0.012680037994097074,\n \"acc_norm\": 0.4406779661016949,\n \"acc_norm_stderr\": 0.012680037994097074\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324224,\n \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324224\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425464,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425464\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n \"acc_stderr\": 0.029705284056772436,\n \"acc_norm\": 0.7711442786069652,\n \"acc_norm_stderr\": 0.029705284056772436\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3463892288861689,\n \"mc1_stderr\": 0.016656997109125136,\n \"mc2\": 0.5143942772336377,\n \"mc2_stderr\": 0.015015865193028501\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838234\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.30401819560272936,\n \"acc_stderr\": 
0.012670420440198662\n }\n}\n```", "repo_url": "https://huggingface.co/diffnamehard/Psyfighter2-Noromaid-ties-Capybara-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-20-55.847857.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-20-55.847857.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-20-55.847857.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-20-55.847857.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-20-55.847857.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T20_20_55.847857", "path": ["**/details_harness|winogrande|5_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T20-20-55.847857.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T20_20_55.847857", "path": ["results_2024-01-13T20-20-55.847857.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T20-20-55.847857.parquet"]}]}]} | 2024-01-13T20:23:34+00:00 |
4c280edb4a4b1f6978ef8b3d3e9d70c600572f50 | MyRebRIc/datasetdoschwoz | [
"region:us"
] | 2024-01-13T20:23:54+00:00 | {} | 2024-01-13T20:24:25+00:00 |
|
ee46601ee2de700abe3d0a789f8ea867f45f80ca | bikram20/bg-temp-dataset | [
"license:mit",
"region:us"
] | 2024-01-13T20:25:34+00:00 | {"license": "mit"} | 2024-01-13T20:25:34+00:00 |
|
41d618511ea3f3a3b472a3bafd90a8a3913e3b34 |
# Dataset Card for Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [HenryJJ/dolphin-2.6-mistral-7b-dpo-orca](https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca",
"harness_winogrande_5",
split="train")
```
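
The aggregated metrics shown under "Latest results" below are also available through the dedicated `results` configuration listed in this card's configs; a minimal sketch, assuming the `results` configuration and its `latest` split as declared in the metadata:

```python
from datasets import load_dataset

# Load the aggregated results table; the "latest" split points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca",
	"results",
	split="latest")
```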
## Latest results
These are the [latest results from run 2024-01-13T20:29:58.885355](https://huggingface.co/datasets/open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca/blob/main/results_2024-01-13T20-29-58.885355.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6191660640057981,
"acc_stderr": 0.03263652891344978,
"acc_norm": 0.6271945727055741,
"acc_norm_stderr": 0.03333445432068468,
"mc1": 0.43329253365973075,
"mc1_stderr": 0.017347024450107492,
"mc2": 0.5997212380160826,
"mc2_stderr": 0.015696061571327326
},
"harness|arc:challenge|25": {
"acc": 0.6271331058020477,
"acc_stderr": 0.014131176760131167,
"acc_norm": 0.6604095563139932,
"acc_norm_stderr": 0.01383903976282017
},
"harness|hellaswag|10": {
"acc": 0.6580362477594105,
"acc_stderr": 0.004733980470799212,
"acc_norm": 0.8462457677753435,
"acc_norm_stderr": 0.0035997580435468044
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155243,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155243
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110932,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110932
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612907,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612907
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229962,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229962
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266196,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266196
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879716,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3865921787709497,
"acc_stderr": 0.016286674879101022,
"acc_norm": 0.3865921787709497,
"acc_norm_stderr": 0.016286674879101022
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914389,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914389
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02540719779889016,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02540719779889016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43285528031290743,
"acc_stderr": 0.012654565234622868,
"acc_norm": 0.43285528031290743,
"acc_norm_stderr": 0.012654565234622868
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928007,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928007
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.01922832201869664,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.01922832201869664
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43329253365973075,
"mc1_stderr": 0.017347024450107492,
"mc2": 0.5997212380160826,
"mc2_stderr": 0.015696061571327326
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.011585871710209408
},
"harness|gsm8k|5": {
"acc": 0.20318423047763456,
"acc_stderr": 0.011083227665267797
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca | [
"region:us"
] | 2024-01-13T20:32:17+00:00 | {"pretty_name": "Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca", "dataset_summary": "Dataset automatically created during the evaluation run of model [HenryJJ/dolphin-2.6-mistral-7b-dpo-orca](https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T20:29:58.885355](https://huggingface.co/datasets/open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca/blob/main/results_2024-01-13T20-29-58.885355.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6191660640057981,\n \"acc_stderr\": 0.03263652891344978,\n \"acc_norm\": 0.6271945727055741,\n \"acc_norm_stderr\": 0.03333445432068468,\n \"mc1\": 0.43329253365973075,\n \"mc1_stderr\": 0.017347024450107492,\n \"mc2\": 0.5997212380160826,\n \"mc2_stderr\": 0.015696061571327326\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6271331058020477,\n \"acc_stderr\": 0.014131176760131167,\n \"acc_norm\": 0.6604095563139932,\n \"acc_norm_stderr\": 0.01383903976282017\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6580362477594105,\n \"acc_stderr\": 0.004733980470799212,\n \"acc_norm\": 0.8462457677753435,\n \"acc_norm_stderr\": 0.0035997580435468044\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835771,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835771\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155243,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155243\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n \"acc_norm\": 0.8704663212435233,\n 
\"acc_norm_stderr\": 0.024233532297758723\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110932,\n \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110932\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612907,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612907\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229962,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229962\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879716,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879716\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3865921787709497,\n \"acc_stderr\": 0.016286674879101022,\n \"acc_norm\": 0.3865921787709497,\n \"acc_norm_stderr\": 0.016286674879101022\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914389,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914389\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02540719779889016,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02540719779889016\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43285528031290743,\n \"acc_stderr\": 0.012654565234622868,\n \"acc_norm\": 0.43285528031290743,\n \"acc_norm_stderr\": 0.012654565234622868\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.02972215209928007,\n \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.02972215209928007\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.01922832201869664,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.01922832201869664\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43329253365973075,\n \"mc1_stderr\": 0.017347024450107492,\n \"mc2\": 0.5997212380160826,\n \"mc2_stderr\": 0.015696061571327326\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.011585871710209408\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.20318423047763456,\n \"acc_stderr\": 0.011083227665267797\n }\n}\n```", "repo_url": "https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-29-58.885355.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-29-58.885355.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-29-58.885355.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-29-58.885355.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-29-58.885355.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T20_29_58.885355", "path": ["**/details_harness|winogrande|5_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T20-29-58.885355.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T20_29_58.885355", "path": ["results_2024-01-13T20-29-58.885355.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T20-29-58.885355.parquet"]}]}]} | 2024-01-13T20:32:38+00:00 |
0529b8fe966a2bb02530263f2a987be078f71c76 | Darok/gov | [
"region:us"
] | 2024-01-13T20:32:50+00:00 | {} | 2024-01-13T20:33:46+00:00 |
|
ab9ed84fe9ee8fb420f99b8ccffb52d74184ddfe |
# Dataset Card for Evaluation run of RatanRohith/NeuralPizza-7B-V0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RatanRohith/NeuralPizza-7B-V0.1](https://huggingface.co/RatanRohith/NeuralPizza-7B-V0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-V0.1",
"harness_winogrande_5",
split="train")
```
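
The aggregated metrics shown on the leaderboard are stored in the "results" configuration. As a minimal sketch (assuming the "results" config and the "latest" split follow the naming pattern listed in this card's configuration metadata), they can be loaded in the same way:

```python
from datasets import load_dataset

# Aggregated results for this model: one entry per evaluation run.
# "latest" selects the most recent run; a timestamped split name selects a specific run.
results = load_dataset("open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-V0.1",
                       "results",
                       split="latest")
```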
## Latest results
These are the [latest results from run 2024-01-13T20:34:27.461906](https://huggingface.co/datasets/open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-V0.1/blob/main/results_2024-01-13T20-34-27.461906.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6475482692691918,
"acc_stderr": 0.03229281059357549,
"acc_norm": 0.64912528215574,
"acc_norm_stderr": 0.03294231215287106,
"mc1": 0.5042839657282742,
"mc1_stderr": 0.017502858577371255,
"mc2": 0.6721952166431592,
"mc2_stderr": 0.015433999381498234
},
"harness|arc:challenge|25": {
"acc": 0.6757679180887372,
"acc_stderr": 0.013678810399518822,
"acc_norm": 0.7047781569965871,
"acc_norm_stderr": 0.01332975029338232
},
"harness|hellaswag|10": {
"acc": 0.7062338179645489,
"acc_stderr": 0.004545552424153379,
"acc_norm": 0.8730332603067118,
"acc_norm_stderr": 0.0033225528296089036
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.02501074911613759,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.02501074911613759
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479047,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479047
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970572,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970572
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02865749128507197,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02865749128507197
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461766,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461766
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229092,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229092
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993457,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993457
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508283,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4122905027932961,
"acc_stderr": 0.01646320023811452,
"acc_norm": 0.4122905027932961,
"acc_norm_stderr": 0.01646320023811452
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897229,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897229
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406762,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406762
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291296,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.02411267824090083,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.02411267824090083
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5042839657282742,
"mc1_stderr": 0.017502858577371255,
"mc2": 0.6721952166431592,
"mc2_stderr": 0.015433999381498234
},
"harness|winogrande|5": {
"acc": 0.8034727703235991,
"acc_stderr": 0.01116812059356957
},
"harness|gsm8k|5": {
"acc": 0.5943896891584534,
"acc_stderr": 0.01352484889446211
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-V0.1 | [
"region:us"
] | 2024-01-13T20:36:43+00:00 | {"pretty_name": "Evaluation run of RatanRohith/NeuralPizza-7B-V0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [RatanRohith/NeuralPizza-7B-V0.1](https://huggingface.co/RatanRohith/NeuralPizza-7B-V0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-V0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T20:34:27.461906](https://huggingface.co/datasets/open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-V0.1/blob/main/results_2024-01-13T20-34-27.461906.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6475482692691918,\n \"acc_stderr\": 0.03229281059357549,\n \"acc_norm\": 0.64912528215574,\n \"acc_norm_stderr\": 0.03294231215287106,\n \"mc1\": 0.5042839657282742,\n \"mc1_stderr\": 0.017502858577371255,\n \"mc2\": 0.6721952166431592,\n \"mc2_stderr\": 0.015433999381498234\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6757679180887372,\n \"acc_stderr\": 0.013678810399518822,\n \"acc_norm\": 0.7047781569965871,\n \"acc_norm_stderr\": 0.01332975029338232\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7062338179645489,\n \"acc_stderr\": 0.004545552424153379,\n \"acc_norm\": 0.8730332603067118,\n \"acc_norm_stderr\": 0.0033225528296089036\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n 
\"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.02501074911613759,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.02501074911613759\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479047,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479047\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970572,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970572\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02865749128507197,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02865749128507197\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461766,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461766\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8237547892720306,\n \"acc_stderr\": 0.013625556907993457,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993457\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508283,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4122905027932961,\n \"acc_stderr\": 0.01646320023811452,\n \"acc_norm\": 0.4122905027932961,\n \"acc_norm_stderr\": 0.01646320023811452\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.012741974333897229,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.012741974333897229\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406762,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406762\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n \"acc_stderr\": 0.02411267824090083,\n \"acc_norm\": 0.8656716417910447,\n \"acc_norm_stderr\": 0.02411267824090083\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5042839657282742,\n \"mc1_stderr\": 0.017502858577371255,\n \"mc2\": 0.6721952166431592,\n \"mc2_stderr\": 0.015433999381498234\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.01116812059356957\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5943896891584534,\n \"acc_stderr\": 0.01352484889446211\n 
}\n}\n```", "repo_url": "https://huggingface.co/RatanRohith/NeuralPizza-7B-V0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-34-27.461906.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-34-27.461906.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-34-27.461906.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-34-27.461906.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-34-27.461906.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T20_34_27.461906", "path": ["**/details_harness|winogrande|5_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T20-34-27.461906.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T20_34_27.461906", "path": ["results_2024-01-13T20-34-27.461906.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T20-34-27.461906.parquet"]}]}]} | 2024-01-13T20:37:04+00:00 |
0dc4892cef7e5b88060b6455feaf85f7ff98bde8 | biadrivex/bonito | [
"license:openrail",
"region:us"
] | 2024-01-13T20:38:12+00:00 | {"license": "openrail"} | 2024-01-13T20:39:05+00:00 |
|
20bb86190831686862fd9433bb417c5f1a2506d6 | The dataset contains multi-modal data from over 75,000 open access and de-identified case reports, including metadata, clinical cases, image captions and more than 130,000 images. Images and clinical cases belong to different medical specialties, such as oncology, cardiology, surgery and pathology. The structure of the dataset makes it easy to map images to their corresponding article metadata, clinical case, captions and image labels. Details of the data structure can be found in the file data_dictionary.csv.
Almost 100,000 patients and almost 400,000 medical doctors and researchers were involved in the creation of the articles included in this dataset. The citation data of each article can be found in the metadata.parquet file.
Refer to the examples showcased in [this GitHub repository](https://github.com/mauro-nievoff/MultiCaRe_Dataset) to understand how to optimize the use of this dataset.
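As a complementary illustration (not a substitute for the official examples), the minimal sketch below shows one possible way to inspect the files named above with pandas. It assumes the dataset files, including metadata.parquet and data_dictionary.csv, have already been downloaded locally; check data_dictionary.csv for the actual field names before mapping images to their articles.

```python
# Minimal sketch, assuming metadata.parquet and data_dictionary.csv have been
# downloaded locally (e.g. from the Hugging Face or Zenodo records in this card).
import pandas as pd

# Article-level metadata, including the citation data mentioned above.
metadata = pd.read_parquet("metadata.parquet")

# data_dictionary.csv documents every field and how images map to
# article metadata, clinical cases, captions and image labels.
data_dictionary = pd.read_csv("data_dictionary.csv")

print(metadata.shape)
print(data_dictionary.head())
```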
For a detailed insight into the contents of this dataset, please refer to [this data article](https://www.sciencedirect.com/science/article/pii/S2352340923010351) published in Data In Brief.
The dataset is also available on [Zenodo](https://zenodo.org/records/10079370). | mauro-nievoff/MultiCaRe_Dataset | [
"task_categories:image-classification",
"task_categories:image-to-text",
"task_categories:text-to-image",
"language:en",
"license:cc-by-4.0",
"medical",
"images",
"computer vision",
"multimodal",
"text",
"clinical",
"nlp",
"region:us"
] | 2024-01-13T20:38:21+00:00 | {"language": ["en"], "license": "cc-by-4.0", "task_categories": ["image-classification", "image-to-text", "text-to-image"], "pretty_name": "MultiCaRe Dataset", "tags": ["medical", "images", "computer vision", "multimodal", "text", "clinical", "nlp"]} | 2024-01-14T15:02:24+00:00 |
e2dcadae9756394104348ed94b7b580d102e7f94 |
# Dataset Card for Evaluation run of kevin009/lamatama
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kevin009/lamatama](https://huggingface.co/kevin009/lamatama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kevin009__lamatama",
"harness_winogrande_5",
split="train")
```
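Similarly, the sketch below loads the aggregated run-level results described above; it assumes the "results" configuration and its "latest" split follow the same layout as the other details datasets listed in this repository.

```python
from datasets import load_dataset

# Aggregated results of the run; the "latest" split is assumed to point
# to the most recent evaluation, per the configuration listing.
results = load_dataset("open-llm-leaderboard/details_kevin009__lamatama",
	"results",
	split="latest")
```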
## Latest results
These are the [latest results from run 2024-01-13T20:41:45.535254](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__lamatama/blob/main/results_2024-01-13T20-41-45.535254.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25449022748027395,
"acc_stderr": 0.030683186117771787,
"acc_norm": 0.2553066382594162,
"acc_norm_stderr": 0.031421716720794905,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.01481619599193158,
"mc2": 0.3767314036539428,
"mc2_stderr": 0.013774459138435797
},
"harness|arc:challenge|25": {
"acc": 0.34726962457337884,
"acc_stderr": 0.013913034529620436,
"acc_norm": 0.363481228668942,
"acc_norm_stderr": 0.014056207319068283
},
"harness|hellaswag|10": {
"acc": 0.45777733519219277,
"acc_stderr": 0.004971958480920495,
"acc_norm": 0.6112328221469827,
"acc_norm_stderr": 0.004864740134043669
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.14814814814814814,
"acc_stderr": 0.030688647610352674,
"acc_norm": 0.14814814814814814,
"acc_norm_stderr": 0.030688647610352674
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123387,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123387
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.02713429162874171,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.02713429162874171
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.1907514450867052,
"acc_stderr": 0.029957851329869337,
"acc_norm": 0.1907514450867052,
"acc_norm_stderr": 0.029957851329869337
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307811,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307811
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.02865917937429232,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.02865917937429232
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518754,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518754
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24338624338624337,
"acc_stderr": 0.022101128787415433,
"acc_norm": 0.24338624338624337,
"acc_norm_stderr": 0.022101128787415433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523809,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523809
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22258064516129034,
"acc_stderr": 0.02366421667164251,
"acc_norm": 0.22258064516129034,
"acc_norm_stderr": 0.02366421667164251
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1921182266009852,
"acc_stderr": 0.02771931570961477,
"acc_norm": 0.1921182266009852,
"acc_norm_stderr": 0.02771931570961477
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.033464098810559534,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.033464098810559534
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752943,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752943
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24102564102564103,
"acc_stderr": 0.02168554666533319,
"acc_norm": 0.24102564102564103,
"acc_norm_stderr": 0.02168554666533319
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.02720537153827947,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.02720537153827947
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23302752293577983,
"acc_stderr": 0.0181256691808615,
"acc_norm": 0.23302752293577983,
"acc_norm_stderr": 0.0181256691808615
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3425925925925926,
"acc_stderr": 0.03236585252602158,
"acc_norm": 0.3425925925925926,
"acc_norm_stderr": 0.03236585252602158
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.028304657943035307,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.028304657943035307
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34977578475336324,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.34977578475336324,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.03322015795776741,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.03322015795776741
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.029745048572674036,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.029745048572674036
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2796934865900383,
"acc_stderr": 0.01605079214803656,
"acc_norm": 0.2796934865900383,
"acc_norm_stderr": 0.01605079214803656
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22832369942196531,
"acc_stderr": 0.022598703804321624,
"acc_norm": 0.22832369942196531,
"acc_norm_stderr": 0.022598703804321624
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22569832402234638,
"acc_stderr": 0.013981395058455052,
"acc_norm": 0.22569832402234638,
"acc_norm_stderr": 0.013981395058455052
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.024288619466046112,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.024288619466046112
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2572347266881029,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.2572347266881029,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22695035460992907,
"acc_stderr": 0.024987106365642976,
"acc_norm": 0.22695035460992907,
"acc_norm_stderr": 0.024987106365642976
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24119947848761408,
"acc_stderr": 0.01092649610203496,
"acc_norm": 0.24119947848761408,
"acc_norm_stderr": 0.01092649610203496
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1875,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.01777694715752804,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.01777694715752804
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3373493975903614,
"acc_stderr": 0.03680783690727581,
"acc_norm": 0.3373493975903614,
"acc_norm_stderr": 0.03680783690727581
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.03488647713457921,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.03488647713457921
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.01481619599193158,
"mc2": 0.3767314036539428,
"mc2_stderr": 0.013774459138435797
},
"harness|winogrande|5": {
"acc": 0.6077348066298343,
"acc_stderr": 0.013722400462000885
},
"harness|gsm8k|5": {
"acc": 0.022744503411675512,
"acc_stderr": 0.004106620637749707
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kevin009__lamatama | [
"region:us"
] | 2024-01-13T20:43:34+00:00 | {"pretty_name": "Evaluation run of kevin009/lamatama", "dataset_summary": "Dataset automatically created during the evaluation run of model [kevin009/lamatama](https://huggingface.co/kevin009/lamatama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kevin009__lamatama\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T20:41:45.535254](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__lamatama/blob/main/results_2024-01-13T20-41-45.535254.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25449022748027395,\n \"acc_stderr\": 0.030683186117771787,\n \"acc_norm\": 0.2553066382594162,\n \"acc_norm_stderr\": 0.031421716720794905,\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.01481619599193158,\n \"mc2\": 0.3767314036539428,\n \"mc2_stderr\": 0.013774459138435797\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.34726962457337884,\n \"acc_stderr\": 0.013913034529620436,\n \"acc_norm\": 0.363481228668942,\n \"acc_norm_stderr\": 0.014056207319068283\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45777733519219277,\n \"acc_stderr\": 0.004971958480920495,\n \"acc_norm\": 0.6112328221469827,\n \"acc_norm_stderr\": 0.004864740134043669\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.14814814814814814,\n \"acc_stderr\": 0.030688647610352674,\n \"acc_norm\": 0.14814814814814814,\n \"acc_norm_stderr\": 0.030688647610352674\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123387,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123387\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.02713429162874171,\n \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.02713429162874171\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n 
\"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.1907514450867052,\n \"acc_stderr\": 0.029957851329869337,\n \"acc_norm\": 0.1907514450867052,\n \"acc_norm_stderr\": 0.029957851329869337\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307811,\n \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307811\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.02865917937429232,\n \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.02865917937429232\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n \"acc_stderr\": 0.03892431106518754,\n \"acc_norm\": 0.21929824561403508,\n \"acc_norm_stderr\": 0.03892431106518754\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24338624338624337,\n \"acc_stderr\": 0.022101128787415433,\n \"acc_norm\": 0.24338624338624337,\n \"acc_norm_stderr\": 0.022101128787415433\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.03809523809523809,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.03809523809523809\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22258064516129034,\n \"acc_stderr\": 0.02366421667164251,\n \"acc_norm\": 0.22258064516129034,\n \"acc_norm_stderr\": 0.02366421667164251\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.1921182266009852,\n \"acc_stderr\": 0.02771931570961477,\n \"acc_norm\": 0.1921182266009852,\n \"acc_norm_stderr\": 0.02771931570961477\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.033464098810559534,\n \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.033464098810559534\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.02912652283458682,\n \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.02912652283458682\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752943,\n \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752943\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.24102564102564103,\n \"acc_stderr\": 0.02168554666533319,\n \"acc_norm\": 0.24102564102564103,\n \"acc_norm_stderr\": 0.02168554666533319\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.02720537153827947,\n \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.02720537153827947\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23302752293577983,\n \"acc_stderr\": 0.0181256691808615,\n \"acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.0181256691808615\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3425925925925926,\n \"acc_stderr\": 0.03236585252602158,\n \"acc_norm\": 0.3425925925925926,\n \"acc_norm_stderr\": 0.03236585252602158\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591362,\n \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591362\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035307,\n \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035307\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34977578475336324,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.34977578475336324,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.03322015795776741,\n \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.03322015795776741\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.029745048572674036,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.029745048572674036\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2796934865900383,\n \"acc_stderr\": 0.01605079214803656,\n 
\"acc_norm\": 0.2796934865900383,\n \"acc_norm_stderr\": 0.01605079214803656\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.22832369942196531,\n \"acc_stderr\": 0.022598703804321624,\n \"acc_norm\": 0.22832369942196531,\n \"acc_norm_stderr\": 0.022598703804321624\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22569832402234638,\n \"acc_stderr\": 0.013981395058455052,\n \"acc_norm\": 0.22569832402234638,\n \"acc_norm_stderr\": 0.013981395058455052\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.024288619466046112,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.024288619466046112\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2572347266881029,\n \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.2572347266881029,\n \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.22695035460992907,\n \"acc_stderr\": 0.024987106365642976,\n \"acc_norm\": 0.22695035460992907,\n \"acc_norm_stderr\": 0.024987106365642976\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24119947848761408,\n \"acc_stderr\": 0.01092649610203496,\n \"acc_norm\": 0.24119947848761408,\n \"acc_norm_stderr\": 0.01092649610203496\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.1875,\n \"acc_stderr\": 0.023709788253811766,\n \"acc_norm\": 0.1875,\n \"acc_norm_stderr\": 0.023709788253811766\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.01777694715752804,\n \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.01777694715752804\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3373493975903614,\n \"acc_stderr\": 0.03680783690727581,\n \"acc_norm\": 0.3373493975903614,\n \"acc_norm_stderr\": 0.03680783690727581\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.03488647713457921,\n \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.03488647713457921\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.01481619599193158,\n \"mc2\": 0.3767314036539428,\n \"mc2_stderr\": 0.013774459138435797\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6077348066298343,\n \"acc_stderr\": 0.013722400462000885\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.022744503411675512,\n \"acc_stderr\": 0.004106620637749707\n }\n}\n```", "repo_url": 
"https://huggingface.co/kevin009/lamatama", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-41-45.535254.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-41-45.535254.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-41-45.535254.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-41-45.535254.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-41-45.535254.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-41-45.535254.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["**/details_harness|winogrande|5_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T20-41-45.535254.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T20_41_45.535254", "path": ["results_2024-01-13T20-41-45.535254.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T20-41-45.535254.parquet"]}]}]} | 2024-01-13T20:43:55+00:00 |
da169aea4b814f9e072078d7e6091823c8a876d0 |
# Dataset of atlanta/アトランタ/亚特兰大 (Azur Lane)
This is the dataset of atlanta/アトランタ/亚特兰大 (Azur Lane), containing 19 images and their tags.
The core tags of this character are `pink_hair, blue_eyes, braid, long_hair, ahoge, bangs, crown_braid, black_ribbon, hair_ribbon, ribbon, breasts, hair_ornament, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 19 | 16.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/atlanta_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 11.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/atlanta_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 37 | 20.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/atlanta_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 14.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/atlanta_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 37 | 24.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/atlanta_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/atlanta_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, looking_at_viewer, solo, fingerless_gloves, red_necktie, white_shirt, bare_shoulders, blue_skirt, pleated_skirt, white_thighhighs, blush, simple_background, single_thighhigh, white_background, detached_collar, miniskirt, detached_sleeves, off-shoulder_shirt, open_mouth, smile, red_gloves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | fingerless_gloves | red_necktie | white_shirt | bare_shoulders | blue_skirt | pleated_skirt | white_thighhighs | blush | simple_background | single_thighhigh | white_background | detached_collar | miniskirt | detached_sleeves | off-shoulder_shirt | open_mouth | smile | red_gloves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------------------|:--------------|:--------------|:-----------------|:-------------|:----------------|:-------------------|:--------|:--------------------|:-------------------|:-------------------|:------------------|:------------|:-------------------|:---------------------|:-------------|:--------|:-------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/atlanta_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T20:43:34+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T20:48:11+00:00 |
f69fa4ce6b5843be6485888ebb66c9cda9550e0b |
# Dataset of chitose/千歳/千岁 (Azur Lane)
This is the dataset of chitose/千歳/千岁 (Azur Lane), containing 32 images and their tags.
The core tags of this character are `breasts, long_hair, large_breasts, red_hair, purple_eyes, bangs, hat, mask_on_head, sun_hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 32 | 56.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chitose_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 32 | 31.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chitose_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 78 | 62.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chitose_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 32 | 50.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chitose_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 78 | 94.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chitose_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chitose_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, looking_at_viewer, red_bikini, straw_hat, solo, side-tie_bikini_bottom, blush, outdoors, purple_hair |
| 1 | 13 |  |  |  |  |  | 1girl, looking_at_viewer, fox_mask, solo, bare_shoulders, cleavage, blush, japanese_clothes, skirt, wide_sleeves, simple_background, veil, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | red_bikini | straw_hat | solo | side-tie_bikini_bottom | blush | outdoors | purple_hair | fox_mask | bare_shoulders | cleavage | japanese_clothes | skirt | wide_sleeves | simple_background | veil | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------------|:------------|:-------|:-------------------------|:--------|:-----------|:--------------|:-----------|:-----------------|:-----------|:-------------------|:--------|:---------------|:--------------------|:-------|:-------------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | | | X | | X | | | X | X | X | X | X | X | X | X | X |
| CyberHarem/chitose_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T20:43:37+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T20:55:29+00:00 |
a27fd37029bb797ff87139b156d43e67469825cf |
# Dataset Card for Evaluation run of VitalContribution/Evangelion-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [VitalContribution/Evangelion-7B](https://huggingface.co/VitalContribution/Evangelion-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_VitalContribution__Evangelion-7B",
"harness_winogrande_5",
split="train")
```
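The aggregated metrics mentioned above live in the `results` configuration and can be pulled the same way. Below is a minimal sketch, assuming the repository id from this card and the `latest` split that these auto-generated configurations expose (the `train` alias used above is documented to point to the same latest results):
```python
from datasets import load_dataset

# Load the aggregated "results" configuration for this evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_VitalContribution__Evangelion-7B",
    "results",
    split="latest",
)

# One row per run, holding the aggregated per-task metrics.
print(results[0])
```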
## Latest results
These are the [latest results from run 2024-01-13T20:42:12.664551](https://huggingface.co/datasets/open-llm-leaderboard/details_VitalContribution__Evangelion-7B/blob/main/results_2024-01-13T20-42-12.664551.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6436279768533524,
"acc_stderr": 0.03212568059056159,
"acc_norm": 0.6443410769989211,
"acc_norm_stderr": 0.032776722271689644,
"mc1": 0.4773561811505508,
"mc1_stderr": 0.017485542258489646,
"mc2": 0.6400670036423886,
"mc2_stderr": 0.014997645589691178
},
"harness|arc:challenge|25": {
"acc": 0.6544368600682594,
"acc_stderr": 0.013896938461145678,
"acc_norm": 0.689419795221843,
"acc_norm_stderr": 0.013522292098053064
},
"harness|hellaswag|10": {
"acc": 0.6756622186815375,
"acc_stderr": 0.004671701705567242,
"acc_norm": 0.8644692292372037,
"acc_norm_stderr": 0.0034159007223818934
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055256,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055256
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.0315841532404771,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.0315841532404771
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229876,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229876
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374294,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374294
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467618,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467618
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.016115235504865474,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.016115235504865474
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45045632333767927,
"acc_stderr": 0.012707390438502346,
"acc_norm": 0.45045632333767927,
"acc_norm_stderr": 0.012707390438502346
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170605,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170605
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4773561811505508,
"mc1_stderr": 0.017485542258489646,
"mc2": 0.6400670036423886,
"mc2_stderr": 0.014997645589691178
},
"harness|winogrande|5": {
"acc": 0.7995264404104183,
"acc_stderr": 0.011251958281205067
},
"harness|gsm8k|5": {
"acc": 0.6694465504169825,
"acc_stderr": 0.012957496367085028
}
}
```
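The aggregated metrics shown above are also stored as standalone `results_*.json` files at the root of this dataset repository. The snippet below is an illustrative sketch only; the `results_*.json` naming scheme and repository layout are assumptions based on how this card references its latest run.

```python
import json

from huggingface_hub import hf_hub_download, list_repo_files

# Illustrative sketch (not part of the original card): download the newest
# aggregated-results file from this dataset repository. The results_*.json
# naming scheme is assumed from the run snapshot shown above.
repo_id = "open-llm-leaderboard/details_VitalContribution__Evangelion-7B"

candidates = [
    name
    for name in list_repo_files(repo_id, repo_type="dataset")
    if name.startswith("results_") and name.endswith(".json")
]
latest_file = sorted(candidates)[-1]  # ISO-like timestamps sort lexicographically

local_path = hf_hub_download(repo_id=repo_id, filename=latest_file, repo_type="dataset")
with open(local_path) as fh:
    payload = json.load(fh)

print(sorted(payload.keys()))  # inspect the structure before digging into task scores
```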
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_VitalContribution__Evangelion-7B | [
"region:us"
] | 2024-01-13T20:44:31+00:00 | {"pretty_name": "Evaluation run of VitalContribution/Evangelion-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [VitalContribution/Evangelion-7B](https://huggingface.co/VitalContribution/Evangelion-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_VitalContribution__Evangelion-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T20:42:12.664551](https://huggingface.co/datasets/open-llm-leaderboard/details_VitalContribution__Evangelion-7B/blob/main/results_2024-01-13T20-42-12.664551.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6436279768533524,\n \"acc_stderr\": 0.03212568059056159,\n \"acc_norm\": 0.6443410769989211,\n \"acc_norm_stderr\": 0.032776722271689644,\n \"mc1\": 0.4773561811505508,\n \"mc1_stderr\": 0.017485542258489646,\n \"mc2\": 0.6400670036423886,\n \"mc2_stderr\": 0.014997645589691178\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6544368600682594,\n \"acc_stderr\": 0.013896938461145678,\n \"acc_norm\": 0.689419795221843,\n \"acc_norm_stderr\": 0.013522292098053064\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6756622186815375,\n \"acc_stderr\": 0.004671701705567242,\n \"acc_norm\": 0.8644692292372037,\n \"acc_norm_stderr\": 0.0034159007223818934\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493857,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493857\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055256,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055256\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229876,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229876\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374294,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374294\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467618,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467618\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8173690932311622,\n \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n \"acc_stderr\": 0.016115235504865474,\n \"acc_norm\": 0.3664804469273743,\n \"acc_norm_stderr\": 0.016115235504865474\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45045632333767927,\n \"acc_stderr\": 0.012707390438502346,\n \"acc_norm\": 0.45045632333767927,\n \"acc_norm_stderr\": 0.012707390438502346\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170605,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170605\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4773561811505508,\n \"mc1_stderr\": 0.017485542258489646,\n \"mc2\": 0.6400670036423886,\n \"mc2_stderr\": 0.014997645589691178\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7995264404104183,\n \"acc_stderr\": 0.011251958281205067\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6694465504169825,\n \"acc_stderr\": 
0.012957496367085028\n }\n}\n```", "repo_url": "https://huggingface.co/VitalContribution/Evangelion-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-42-12.664551.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-42-12.664551.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-42-12.664551.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-42-12.664551.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-42-12.664551.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T20_42_12.664551", "path": ["**/details_harness|winogrande|5_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T20-42-12.664551.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T20_42_12.664551", "path": ["results_2024-01-13T20-42-12.664551.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T20-42-12.664551.parquet"]}]}]} | 2024-01-13T20:44:53+00:00 |
9a2b35a5cdca39c8ff54d83f4b315079b3ff0d3b | speed1/rockgerio | [
"license:openrail",
"region:us"
] | 2024-01-13T20:46:26+00:00 | {"license": "openrail"} | 2024-01-13T20:46:44+00:00 |
|
62851b1015a6f4dcae2e2b5b6016794f5d6d0e04 |
# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.3](https://huggingface.co/andysalerno/openchat-nectar-0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andysalerno__openchat-nectar-0.3",
"harness_winogrande_5",
split="train")
```
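If you only need the aggregated scores rather than the per-sample details, the "results" configuration described above can be loaded in the same way. The snippet below is a minimal sketch; it assumes the `results` configuration and its `latest` split are present for this dataset, mirroring the per-task configurations.

```python
from datasets import load_dataset

# Minimal sketch: load the aggregated run-level metrics instead of the
# per-sample details. The "results" configuration and "latest" split are
# assumed to exist for this dataset.
results = load_dataset(
    "open-llm-leaderboard/details_andysalerno__openchat-nectar-0.3",
    "results",
    split="latest",
)

# Each row corresponds to one evaluation run and holds its aggregated metrics.
print(results[0])
```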
## Latest results
These are the [latest results from run 2024-01-13T20:54:22.741821](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.3/blob/main/results_2024-01-13T20-54-22.741821.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6567583142066592,
"acc_stderr": 0.03181491575998619,
"acc_norm": 0.6577099601186088,
"acc_norm_stderr": 0.03246653894337514,
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262258,
"mc2": 0.5238305779540199,
"mc2_stderr": 0.015355411809850603
},
"harness|arc:challenge|25": {
"acc": 0.621160409556314,
"acc_stderr": 0.014175915490000326,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.013847460518892976
},
"harness|hellaswag|10": {
"acc": 0.6349332802230632,
"acc_stderr": 0.004804649197163696,
"acc_norm": 0.8315076677952599,
"acc_norm_stderr": 0.0037353793752550124
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924006,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924006
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.0291265228345868,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.0291265228345868
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768756,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768756
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.02950286112895529,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.02950286112895529
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136098,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136098
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02615686752393104,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02615686752393104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233504,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233504
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7601156069364162,
"acc_stderr": 0.022989592543123563,
"acc_norm": 0.7601156069364162,
"acc_norm_stderr": 0.022989592543123563
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2581005586592179,
"acc_stderr": 0.014635185616527836,
"acc_norm": 0.2581005586592179,
"acc_norm_stderr": 0.014635185616527836
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904212,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904212
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48891786179921776,
"acc_stderr": 0.012767098998525846,
"acc_norm": 0.48891786179921776,
"acc_norm_stderr": 0.012767098998525846
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.026799562024887657,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.026799562024887657
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262258,
"mc2": 0.5238305779540199,
"mc2_stderr": 0.015355411809850603
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.01090597811215687
},
"harness|gsm8k|5": {
"acc": 0.6770280515542078,
"acc_stderr": 0.012880360794851806
}
}
```
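
To reproduce the headline numbers above programmatically, the aggregated metrics can also be loaded from the `results` configuration referenced in this card's metadata. This is a minimal sketch; the parquet stores the same figures as the JSON block above, though its exact column layout is not guaranteed to match the nesting shown there:

```python
from datasets import load_dataset

# "results" is the aggregated-metrics configuration and "latest" its most recent run,
# as listed in this dataset card's configuration metadata.
results = load_dataset(
    "open-llm-leaderboard/details_andysalerno__openchat-nectar-0.3",
    "results",
    split="latest",
)

# A single-run dataset: print the first (and typically only) row to see the metrics.
print(results[0])
```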
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_andysalerno__openchat-nectar-0.3 | [
"region:us"
] | 2024-01-13T20:56:41+00:00 | {"pretty_name": "Evaluation run of andysalerno/openchat-nectar-0.3", "dataset_summary": "Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.3](https://huggingface.co/andysalerno/openchat-nectar-0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__openchat-nectar-0.3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T20:54:22.741821](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.3/blob/main/results_2024-01-13T20-54-22.741821.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6567583142066592,\n \"acc_stderr\": 0.03181491575998619,\n \"acc_norm\": 0.6577099601186088,\n \"acc_norm_stderr\": 0.03246653894337514,\n \"mc1\": 0.3623011015911873,\n \"mc1_stderr\": 0.016826646897262258,\n \"mc2\": 0.5238305779540199,\n \"mc2_stderr\": 0.015355411809850603\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.621160409556314,\n \"acc_stderr\": 0.014175915490000326,\n \"acc_norm\": 0.659556313993174,\n \"acc_norm_stderr\": 0.013847460518892976\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6349332802230632,\n \"acc_stderr\": 0.004804649197163696,\n \"acc_norm\": 0.8315076677952599,\n \"acc_norm_stderr\": 0.0037353793752550124\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.031967586978353627,\n \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.031967586978353627\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924006,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924006\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.02302589961718872,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.02302589961718872\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.0291265228345868,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.0291265228345868\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768756,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768756\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37407407407407406,\n \"acc_stderr\": 0.02950286112895529,\n \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.02950286112895529\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136098,\n \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136098\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.02615686752393104,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02615686752393104\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233504,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233504\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036843,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036843\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n \"acc_stderr\": 
0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123563,\n \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123563\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2581005586592179,\n \"acc_stderr\": 0.014635185616527836,\n \"acc_norm\": 0.2581005586592179,\n \"acc_norm_stderr\": 0.014635185616527836\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904212,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904212\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48891786179921776,\n \"acc_stderr\": 0.012767098998525846,\n \"acc_norm\": 0.48891786179921776,\n \"acc_norm_stderr\": 0.012767098998525846\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.026799562024887657,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.026799562024887657\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3623011015911873,\n \"mc1_stderr\": 0.016826646897262258,\n \"mc2\": 0.5238305779540199,\n \"mc2_stderr\": 0.015355411809850603\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.01090597811215687\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6770280515542078,\n \"acc_stderr\": 0.012880360794851806\n }\n}\n```", "repo_url": 
"https://huggingface.co/andysalerno/openchat-nectar-0.3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-54-22.741821.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-54-22.741821.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-54-22.741821.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-54-22.741821.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-54-22.741821.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-54-22.741821.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["**/details_harness|winogrande|5_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T20-54-22.741821.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T20_54_22.741821", "path": ["results_2024-01-13T20-54-22.741821.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T20-54-22.741821.parquet"]}]}]} | 2024-01-13T20:57:01+00:00 |
a3b1592cec488197d020882632e408b35f928556 |
# Dataset Card for Evaluation run of TomGrc/FusionNet_7Bx2_MoE_14B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TomGrc/FusionNet_7Bx2_MoE_14B](https://huggingface.co/TomGrc/FusionNet_7Bx2_MoE_14B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TomGrc__FusionNet_7Bx2_MoE_14B",
"harness_winogrande_5",
split="train")
```
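To pull the most recent aggregated metrics rather than a single task, you can load the "results" config with its "latest" split (config and split names as listed in this card's metadata); the snippet below is a minimal sketch:
```python
from datasets import load_dataset

# Minimal sketch: load the aggregated metrics of the most recent run.
# The "results" config and its "latest" split are listed in this card's configs.
results = load_dataset(
    "open-llm-leaderboard/details_TomGrc__FusionNet_7Bx2_MoE_14B",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated results of the latest run
```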
## Latest results
These are the [latest results from run 2024-01-13T20:54:40.913978](https://huggingface.co/datasets/open-llm-leaderboard/details_TomGrc__FusionNet_7Bx2_MoE_14B/blob/main/results_2024-01-13T20-54-40.913978.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6538972683722968,
"acc_stderr": 0.0320392240520873,
"acc_norm": 0.6524071431642017,
"acc_norm_stderr": 0.032729125210872866,
"mc1": 0.5716034271725826,
"mc1_stderr": 0.017323088597314743,
"mc2": 0.6959760844818743,
"mc2_stderr": 0.01511859520186482
},
"harness|arc:challenge|25": {
"acc": 0.7039249146757679,
"acc_stderr": 0.013340916085246258,
"acc_norm": 0.735494880546075,
"acc_norm_stderr": 0.012889272949313368
},
"harness|hellaswag|10": {
"acc": 0.7274447321250747,
"acc_stderr": 0.004443639394177423,
"acc_norm": 0.8883688508265286,
"acc_norm_stderr": 0.0031426851645672675
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.0255428468174005,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.0255428468174005
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092444,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092444
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066304,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066304
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4592178770949721,
"acc_stderr": 0.016666783616525776,
"acc_norm": 0.4592178770949721,
"acc_norm_stderr": 0.016666783616525776
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4621903520208605,
"acc_stderr": 0.012733671880342507,
"acc_norm": 0.4621903520208605,
"acc_norm_stderr": 0.012733671880342507
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482705,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482705
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5716034271725826,
"mc1_stderr": 0.017323088597314743,
"mc2": 0.6959760844818743,
"mc2_stderr": 0.01511859520186482
},
"harness|winogrande|5": {
"acc": 0.8816101026045777,
"acc_stderr": 0.009079851554821855
},
"harness|gsm8k|5": {
"acc": 0.7065959059893859,
"acc_stderr": 0.012541830815461492
}
}
```
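As a quick illustration of how these numbers can be post-processed, the sketch below averages the `acc_norm` scores of the MMLU (hendrycksTest) sub-tasks; it assumes the dictionary shown above has already been loaded into a variable named `report` (the variable name is only illustrative):
```python
# Minimal sketch, assuming `report` holds the dictionary shown above.
mmlu_scores = [
    entry["acc_norm"]
    for task, entry in report.items()
    if task.startswith("harness|hendrycksTest")
]
print(f"MMLU acc_norm (macro average over {len(mmlu_scores)} sub-tasks): "
      f"{sum(mmlu_scores) / len(mmlu_scores):.4f}")
```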
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_TomGrc__FusionNet_7Bx2_MoE_14B | [
"region:us"
] | 2024-01-13T20:56:55+00:00 | {"pretty_name": "Evaluation run of TomGrc/FusionNet_7Bx2_MoE_14B", "dataset_summary": "Dataset automatically created during the evaluation run of model [TomGrc/FusionNet_7Bx2_MoE_14B](https://huggingface.co/TomGrc/FusionNet_7Bx2_MoE_14B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TomGrc__FusionNet_7Bx2_MoE_14B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T20:54:40.913978](https://huggingface.co/datasets/open-llm-leaderboard/details_TomGrc__FusionNet_7Bx2_MoE_14B/blob/main/results_2024-01-13T20-54-40.913978.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6538972683722968,\n \"acc_stderr\": 0.0320392240520873,\n \"acc_norm\": 0.6524071431642017,\n \"acc_norm_stderr\": 0.032729125210872866,\n \"mc1\": 0.5716034271725826,\n \"mc1_stderr\": 0.017323088597314743,\n \"mc2\": 0.6959760844818743,\n \"mc2_stderr\": 0.01511859520186482\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7039249146757679,\n \"acc_stderr\": 0.013340916085246258,\n \"acc_norm\": 0.735494880546075,\n \"acc_norm_stderr\": 0.012889272949313368\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7274447321250747,\n \"acc_stderr\": 0.004443639394177423,\n \"acc_norm\": 0.8883688508265286,\n \"acc_norm_stderr\": 0.0031426851645672675\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.0255428468174005\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092444,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092444\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8288633461047255,\n \"acc_stderr\": 0.013468201614066304,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066304\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4592178770949721,\n \"acc_stderr\": 0.016666783616525776,\n \"acc_norm\": 0.4592178770949721,\n \"acc_norm_stderr\": 0.016666783616525776\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4621903520208605,\n \"acc_stderr\": 0.012733671880342507,\n \"acc_norm\": 0.4621903520208605,\n \"acc_norm_stderr\": 0.012733671880342507\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482705,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482705\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5716034271725826,\n \"mc1_stderr\": 0.017323088597314743,\n \"mc2\": 0.6959760844818743,\n \"mc2_stderr\": 0.01511859520186482\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8816101026045777,\n \"acc_stderr\": 0.009079851554821855\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7065959059893859,\n \"acc_stderr\": 0.012541830815461492\n }\n}\n```", "repo_url": 
"https://huggingface.co/TomGrc/FusionNet_7Bx2_MoE_14B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-54-40.913978.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-54-40.913978.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-54-40.913978.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T20-54-40.913978.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-54-40.913978.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T20-54-40.913978.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["**/details_harness|winogrande|5_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T20-54-40.913978.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T20_54_40.913978", "path": ["results_2024-01-13T20-54-40.913978.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T20-54-40.913978.parquet"]}]}]} | 2024-01-13T20:57:16+00:00 |
1fca528639712f17883ef2a9bbfc9cd155871282 | kentlbeck/tidy-first | [
"region:us"
] | 2024-01-13T21:04:09+00:00 | {} | 2024-01-13T21:07:32+00:00 |
|
e735be945927f78486655974dfac7ce3deacebf6 | malucoelhaofc/TolkienV2 | [
"license:openrail",
"region:us"
] | 2024-01-13T21:06:05+00:00 | {"license": "openrail"} | 2024-02-06T18:47:23+00:00 |
|
90489b1aed11259c248c06524dae62a199eab09a |
# Dataset Card for Evaluation run of SicariusSicariiStuff/Tenebra_30B_Alpha01_FP16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SicariusSicariiStuff/Tenebra_30B_Alpha01_FP16](https://huggingface.co/SicariusSicariiStuff/Tenebra_30B_Alpha01_FP16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SicariusSicariiStuff__Tenebra_30B_Alpha01_FP16",
"harness_winogrande_5",
split="train")
```
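
The aggregated metrics mentioned above are exposed through the "results" configuration; a minimal sketch of loading them (using the "latest" split, which points to the most recent run):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model.
results = load_dataset("open-llm-leaderboard/details_SicariusSicariiStuff__Tenebra_30B_Alpha01_FP16",
    "results",
    split="latest")
```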
## Latest results
These are the [latest results from run 2024-01-13T21:09:42.691058](https://huggingface.co/datasets/open-llm-leaderboard/details_SicariusSicariiStuff__Tenebra_30B_Alpha01_FP16/blob/main/results_2024-01-13T21-09-42.691058.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5451166400770657,
"acc_stderr": 0.033739092927066276,
"acc_norm": 0.5497582392701649,
"acc_norm_stderr": 0.03446388375818918,
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.542164417620334,
"mc2_stderr": 0.015177868903320643
},
"harness|arc:challenge|25": {
"acc": 0.6279863481228669,
"acc_stderr": 0.014124597881844461,
"acc_norm": 0.6450511945392492,
"acc_norm_stderr": 0.013983036904094097
},
"harness|hellaswag|10": {
"acc": 0.6488747261501693,
"acc_stderr": 0.004763465139038567,
"acc_norm": 0.8479386576379208,
"acc_norm_stderr": 0.0035834648107534763
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.039993097127774734,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.039993097127774734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5433962264150943,
"acc_stderr": 0.03065674869673943,
"acc_norm": 0.5433962264150943,
"acc_norm_stderr": 0.03065674869673943
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.032685726586674915,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.032685726586674915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.02497695405315524,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.02497695405315524
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.02659308451657228,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.02659308451657228
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.03476725747649037,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.03476725747649037
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817223,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817223
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5538461538461539,
"acc_stderr": 0.025203571773028333,
"acc_norm": 0.5538461538461539,
"acc_norm_stderr": 0.025203571773028333
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.592436974789916,
"acc_stderr": 0.03191863374478466,
"acc_norm": 0.592436974789916,
"acc_norm_stderr": 0.03191863374478466
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7339449541284404,
"acc_stderr": 0.018946022322225607,
"acc_norm": 0.7339449541284404,
"acc_norm_stderr": 0.018946022322225607
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.034028015813589656,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.034028015813589656
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.032867453125679603,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.032867453125679603
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890477,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890477
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7381864623243933,
"acc_stderr": 0.01572083867844526,
"acc_norm": 0.7381864623243933,
"acc_norm_stderr": 0.01572083867844526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.02599247202930638,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.02599247202930638
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2927374301675978,
"acc_stderr": 0.015218109544410177,
"acc_norm": 0.2927374301675978,
"acc_norm_stderr": 0.015218109544410177
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.028332397483664278,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.028332397483664278
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811032,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811032
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.026725868809100793,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.026725868809100793
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573086,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573086
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39765319426336376,
"acc_stderr": 0.012499840347460645,
"acc_norm": 0.39765319426336376,
"acc_norm_stderr": 0.012499840347460645
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.02989616303312547,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.02989616303312547
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.019955975145835546,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.019955975145835546
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.0467375233367024,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.0467375233367024
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.6,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.03203841040213322,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.03203841040213322
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.034240429246915824,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.034240429246915824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.542164417620334,
"mc2_stderr": 0.015177868903320643
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.01152446695409025
},
"harness|gsm8k|5": {
"acc": 0.24639878695981804,
"acc_stderr": 0.011869498557755346
}
}
```
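
If you download the results file linked above, the per-task metrics can be read directly from the JSON; a minimal sketch (the local file name is an assumption, and the snippet assumes the file has exactly the structure excerpted above):

```python
import json

# Hypothetical local copy of the results file linked above.
with open("results_2024-01-13T21-09-42.691058.json") as f:
    results = json.load(f)

# Average accuracy across all tasks, and one per-task metric.
print(results["all"]["acc"])
print(results["harness|arc:challenge|25"]["acc_norm"])
```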
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_SicariusSicariiStuff__Tenebra_30B_Alpha01_FP16 | [
"region:us"
] | 2024-01-13T21:12:00+00:00 | {"pretty_name": "Evaluation run of SicariusSicariiStuff/Tenebra_30B_Alpha01_FP16", "dataset_summary": "Dataset automatically created during the evaluation run of model [SicariusSicariiStuff/Tenebra_30B_Alpha01_FP16](https://huggingface.co/SicariusSicariiStuff/Tenebra_30B_Alpha01_FP16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SicariusSicariiStuff__Tenebra_30B_Alpha01_FP16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T21:09:42.691058](https://huggingface.co/datasets/open-llm-leaderboard/details_SicariusSicariiStuff__Tenebra_30B_Alpha01_FP16/blob/main/results_2024-01-13T21-09-42.691058.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5451166400770657,\n \"acc_stderr\": 0.033739092927066276,\n \"acc_norm\": 0.5497582392701649,\n \"acc_norm_stderr\": 0.03446388375818918,\n \"mc1\": 0.37209302325581395,\n \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.542164417620334,\n \"mc2_stderr\": 0.015177868903320643\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6279863481228669,\n \"acc_stderr\": 0.014124597881844461,\n \"acc_norm\": 0.6450511945392492,\n \"acc_norm_stderr\": 0.013983036904094097\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6488747261501693,\n \"acc_stderr\": 0.004763465139038567,\n \"acc_norm\": 0.8479386576379208,\n \"acc_norm_stderr\": 0.0035834648107534763\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5433962264150943,\n \"acc_stderr\": 0.03065674869673943,\n \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.03065674869673943\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.04076663253918567\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.032685726586674915,\n \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.032685726586674915\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.02497695405315524,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.02497695405315524\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n \"acc_stderr\": 0.02659308451657228,\n \"acc_norm\": 0.6774193548387096,\n \"acc_norm_stderr\": 0.02659308451657228\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.03476725747649037,\n \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.03476725747649037\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817223,\n \"acc_norm\": 0.7875647668393783,\n 
\"acc_norm_stderr\": 0.029519282616817223\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5538461538461539,\n \"acc_stderr\": 0.025203571773028333,\n \"acc_norm\": 0.5538461538461539,\n \"acc_norm_stderr\": 0.025203571773028333\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.03191863374478466,\n \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.03191863374478466\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7339449541284404,\n \"acc_stderr\": 0.018946022322225607,\n \"acc_norm\": 0.7339449541284404,\n \"acc_norm_stderr\": 0.018946022322225607\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.034028015813589656,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.034028015813589656\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.03077855467869326,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.03077855467869326\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n \"acc_stderr\": 0.032867453125679603,\n \"acc_norm\": 0.600896860986547,\n \"acc_norm_stderr\": 0.032867453125679603\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n \"acc_stderr\": 0.026246772946890477,\n \"acc_norm\": 0.7991452991452992,\n \"acc_norm_stderr\": 0.026246772946890477\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7381864623243933,\n \"acc_stderr\": 0.01572083867844526,\n \"acc_norm\": 0.7381864623243933,\n \"acc_norm_stderr\": 0.01572083867844526\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.02599247202930638,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.02599247202930638\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2927374301675978,\n \"acc_stderr\": 0.015218109544410177,\n \"acc_norm\": 0.2927374301675978,\n \"acc_norm_stderr\": 0.015218109544410177\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5718954248366013,\n \"acc_stderr\": 0.028332397483664278,\n \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.028332397483664278\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811032,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811032\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100793,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100793\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573086,\n \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573086\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39765319426336376,\n \"acc_stderr\": 0.012499840347460645,\n \"acc_norm\": 0.39765319426336376,\n \"acc_norm_stderr\": 0.012499840347460645\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02989616303312547,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02989616303312547\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.019955975145835546,\n \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.019955975145835546\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n \"acc_stderr\": 0.0467375233367024,\n \"acc_norm\": 0.6090909090909091,\n \"acc_norm_stderr\": 0.0467375233367024\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.031362502409358936,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.031362502409358936\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n \"acc_stderr\": 0.03203841040213322,\n \"acc_norm\": 0.7114427860696517,\n \"acc_norm_stderr\": 0.03203841040213322\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.46987951807228917,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.034240429246915824,\n \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.034240429246915824\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37209302325581395,\n \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.542164417620334,\n \"mc2_stderr\": 0.015177868903320643\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.01152446695409025\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.24639878695981804,\n 
\"acc_stderr\": 0.011869498557755346\n }\n}\n```", "repo_url": "https://huggingface.co/SicariusSicariiStuff/Tenebra_30B_Alpha01_FP16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|arc:challenge|25_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|gsm8k|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hellaswag|10_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-09-42.691058.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-09-42.691058.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-09-42.691058.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T21-09-42.691058.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-09-42.691058.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T21_09_42.691058", "path": ["**/details_harness|winogrande|5_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T21-09-42.691058.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T21_09_42.691058", "path": ["results_2024-01-13T21-09-42.691058.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T21-09-42.691058.parquet"]}]}]} | 2024-01-13T21:12:26+00:00 |
1732d912289a0ccdc713935d4d138db70940c09e |
# Dataset Card for Evaluation run of PotatoOff/HamSter-0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [PotatoOff/HamSter-0.1](https://huggingface.co/PotatoOff/HamSter-0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PotatoOff__HamSter-0.1",
"harness_winogrande_5",
split="train")
```
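If you only need the aggregated metrics rather than the per-sample details, the "results" configuration described above can be loaded the same way. A minimal sketch, assuming the same repository name and the "latest" split listed in this card's metadata:
```python
from datasets import load_dataset

# Aggregated metrics for this run; the "latest" split points at the most recent results file.
results = load_dataset("open-llm-leaderboard/details_PotatoOff__HamSter-0.1",
	"results",
	split="latest")
print(results)
```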
## Latest results
These are the [latest results from run 2024-01-13T21:11:37.945575](https://huggingface.co/datasets/open-llm-leaderboard/details_PotatoOff__HamSter-0.1/blob/main/results_2024-01-13T21-11-37.945575.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4276097079092565,
"acc_stderr": 0.034155225676891506,
"acc_norm": 0.4351931278757911,
"acc_norm_stderr": 0.03507757220797634,
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762557,
"mc2": 0.5124400262486783,
"mc2_stderr": 0.016311439508262023
},
"harness|arc:challenge|25": {
"acc": 0.4351535836177474,
"acc_stderr": 0.014487986197186047,
"acc_norm": 0.46928327645051193,
"acc_norm_stderr": 0.014583792546304038
},
"harness|hellaswag|10": {
"acc": 0.5039832702648874,
"acc_stderr": 0.004989623068778789,
"acc_norm": 0.6808404700258912,
"acc_norm_stderr": 0.004651982864043485
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237103,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237103
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44528301886792454,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.44528301886792454,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.03629146670159663,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.03629146670159663
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.042663394431593935,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.042663394431593935
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.024026846392873502,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.024026846392873502
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4870967741935484,
"acc_stderr": 0.028434533152681855,
"acc_norm": 0.4870967741935484,
"acc_norm_stderr": 0.028434533152681855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5515151515151515,
"acc_stderr": 0.03883565977956929,
"acc_norm": 0.5515151515151515,
"acc_norm_stderr": 0.03883565977956929
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5808080808080808,
"acc_stderr": 0.03515520728670417,
"acc_norm": 0.5808080808080808,
"acc_norm_stderr": 0.03515520728670417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5595854922279793,
"acc_stderr": 0.03582724530036094,
"acc_norm": 0.5595854922279793,
"acc_norm_stderr": 0.03582724530036094
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4025641025641026,
"acc_stderr": 0.02486499515976776,
"acc_norm": 0.4025641025641026,
"acc_norm_stderr": 0.02486499515976776
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3865546218487395,
"acc_stderr": 0.0316314580755238,
"acc_norm": 0.3865546218487395,
"acc_norm_stderr": 0.0316314580755238
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5100917431192661,
"acc_stderr": 0.021432956203453313,
"acc_norm": 0.5100917431192661,
"acc_norm_stderr": 0.021432956203453313
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.03191923445686186,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.03191923445686186
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5,
"acc_stderr": 0.03509312031717982,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03509312031717982
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5738396624472574,
"acc_stderr": 0.03219035703131774,
"acc_norm": 0.5738396624472574,
"acc_norm_stderr": 0.03219035703131774
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.45739910313901344,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.45739910313901344,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.46564885496183206,
"acc_stderr": 0.043749285605997376,
"acc_norm": 0.46564885496183206,
"acc_norm_stderr": 0.043749285605997376
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5702479338842975,
"acc_stderr": 0.04519082021319771,
"acc_norm": 0.5702479338842975,
"acc_norm_stderr": 0.04519082021319771
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4723926380368098,
"acc_stderr": 0.0392237829061099,
"acc_norm": 0.4723926380368098,
"acc_norm_stderr": 0.0392237829061099
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.5242718446601942,
"acc_stderr": 0.04944901092973779,
"acc_norm": 0.5242718446601942,
"acc_norm_stderr": 0.04944901092973779
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7264957264957265,
"acc_stderr": 0.02920254015343117,
"acc_norm": 0.7264957264957265,
"acc_norm_stderr": 0.02920254015343117
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.050241839379569095,
"acc_norm": 0.49,
"acc_norm_stderr": 0.050241839379569095
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5938697318007663,
"acc_stderr": 0.017562037406478912,
"acc_norm": 0.5938697318007663,
"acc_norm_stderr": 0.017562037406478912
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.407514450867052,
"acc_stderr": 0.026454578146931498,
"acc_norm": 0.407514450867052,
"acc_norm_stderr": 0.026454578146931498
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.014756906483260664,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.014756906483260664
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.028431095444176643,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.028431095444176643
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4405144694533762,
"acc_stderr": 0.028196400574197426,
"acc_norm": 0.4405144694533762,
"acc_norm_stderr": 0.028196400574197426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4567901234567901,
"acc_stderr": 0.02771666165019404,
"acc_norm": 0.4567901234567901,
"acc_norm_stderr": 0.02771666165019404
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32978723404255317,
"acc_stderr": 0.0280459469420424,
"acc_norm": 0.32978723404255317,
"acc_norm_stderr": 0.0280459469420424
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3239895697522816,
"acc_stderr": 0.011952840809646575,
"acc_norm": 0.3239895697522816,
"acc_norm_stderr": 0.011952840809646575
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2536764705882353,
"acc_stderr": 0.02643132987078953,
"acc_norm": 0.2536764705882353,
"acc_norm_stderr": 0.02643132987078953
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.019659922493623336,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.019659922493623336
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5306122448979592,
"acc_stderr": 0.031949171367580624,
"acc_norm": 0.5306122448979592,
"acc_norm_stderr": 0.031949171367580624
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5522388059701493,
"acc_stderr": 0.03516184772952166,
"acc_norm": 0.5522388059701493,
"acc_norm_stderr": 0.03516184772952166
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3373493975903614,
"acc_stderr": 0.0368078369072758,
"acc_norm": 0.3373493975903614,
"acc_norm_stderr": 0.0368078369072758
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.03786720706234214,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.03786720706234214
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762557,
"mc2": 0.5124400262486783,
"mc2_stderr": 0.016311439508262023
},
"harness|winogrande|5": {
"acc": 0.6187845303867403,
"acc_stderr": 0.013650172164160318
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
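Per-task details can be inspected in the same way by picking one of the task configurations listed in this card's metadata. A sketch, assuming the "harness_arc_challenge_25" configuration and the "latest" split (both present in the configs above), with a pandas conversion for quick inspection:
```python
from datasets import load_dataset

# Per-sample details for the ARC-Challenge (25-shot) task of this run.
arc_details = load_dataset("open-llm-leaderboard/details_PotatoOff__HamSter-0.1",
	"harness_arc_challenge_25",
	split="latest")
# Convert to a pandas DataFrame to look at individual predictions and scores.
df = arc_details.to_pandas()
print(df.head())
```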
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_PotatoOff__HamSter-0.1 | [
"region:us"
] | 2024-01-13T21:13:54+00:00 | {"pretty_name": "Evaluation run of PotatoOff/HamSter-0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [PotatoOff/HamSter-0.1](https://huggingface.co/PotatoOff/HamSter-0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PotatoOff__HamSter-0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T21:11:37.945575](https://huggingface.co/datasets/open-llm-leaderboard/details_PotatoOff__HamSter-0.1/blob/main/results_2024-01-13T21-11-37.945575.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4276097079092565,\n \"acc_stderr\": 0.034155225676891506,\n \"acc_norm\": 0.4351931278757911,\n \"acc_norm_stderr\": 0.03507757220797634,\n \"mc1\": 0.3402692778457772,\n \"mc1_stderr\": 0.016586304901762557,\n \"mc2\": 0.5124400262486783,\n \"mc2_stderr\": 0.016311439508262023\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4351535836177474,\n \"acc_stderr\": 0.014487986197186047,\n \"acc_norm\": 0.46928327645051193,\n \"acc_norm_stderr\": 0.014583792546304038\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5039832702648874,\n \"acc_stderr\": 0.004989623068778789,\n \"acc_norm\": 0.6808404700258912,\n \"acc_norm_stderr\": 0.004651982864043485\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.039777499346220734,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.039777499346220734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237103,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237103\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.44528301886792454,\n \"acc_stderr\": 0.030588052974270655,\n \"acc_norm\": 0.44528301886792454,\n \"acc_norm_stderr\": 0.030588052974270655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n 
\"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3468208092485549,\n \"acc_stderr\": 0.03629146670159663,\n \"acc_norm\": 0.3468208092485549,\n \"acc_norm_stderr\": 0.03629146670159663\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3201058201058201,\n \"acc_stderr\": 0.024026846392873502,\n \"acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.024026846392873502\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4870967741935484,\n \"acc_stderr\": 0.028434533152681855,\n \"acc_norm\": 0.4870967741935484,\n \"acc_norm_stderr\": 0.028434533152681855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969566,\n \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969566\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.03883565977956929,\n \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.03883565977956929\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5808080808080808,\n \"acc_stderr\": 0.03515520728670417,\n \"acc_norm\": 0.5808080808080808,\n \"acc_norm_stderr\": 0.03515520728670417\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5595854922279793,\n \"acc_stderr\": 0.03582724530036094,\n \"acc_norm\": 0.5595854922279793,\n \"acc_norm_stderr\": 0.03582724530036094\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.4025641025641026,\n \"acc_stderr\": 0.02486499515976776,\n \"acc_norm\": 0.4025641025641026,\n \"acc_norm_stderr\": 0.02486499515976776\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3865546218487395,\n \"acc_stderr\": 0.0316314580755238,\n \"acc_norm\": 0.3865546218487395,\n \"acc_norm_stderr\": 0.0316314580755238\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5100917431192661,\n \"acc_stderr\": 0.021432956203453313,\n \"acc_norm\": 0.5100917431192661,\n \"acc_norm_stderr\": 0.021432956203453313\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686186,\n \"acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686186\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03509312031717982,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03509312031717982\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5738396624472574,\n \"acc_stderr\": 0.03219035703131774,\n \"acc_norm\": 0.5738396624472574,\n \"acc_norm_stderr\": 0.03219035703131774\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.45739910313901344,\n \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.45739910313901344,\n \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.46564885496183206,\n \"acc_stderr\": 0.043749285605997376,\n \"acc_norm\": 0.46564885496183206,\n \"acc_norm_stderr\": 0.043749285605997376\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5702479338842975,\n \"acc_stderr\": 0.04519082021319771,\n \"acc_norm\": 0.5702479338842975,\n \"acc_norm_stderr\": 0.04519082021319771\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4723926380368098,\n \"acc_stderr\": 0.0392237829061099,\n \"acc_norm\": 0.4723926380368098,\n \"acc_norm_stderr\": 0.0392237829061099\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5242718446601942,\n \"acc_stderr\": 0.04944901092973779,\n \"acc_norm\": 0.5242718446601942,\n \"acc_norm_stderr\": 0.04944901092973779\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7264957264957265,\n \"acc_stderr\": 0.02920254015343117,\n \"acc_norm\": 0.7264957264957265,\n \"acc_norm_stderr\": 0.02920254015343117\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.050241839379569095,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.050241839379569095\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5938697318007663,\n \"acc_stderr\": 0.017562037406478912,\n \"acc_norm\": 0.5938697318007663,\n 
\"acc_norm_stderr\": 0.017562037406478912\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.407514450867052,\n \"acc_stderr\": 0.026454578146931498,\n \"acc_norm\": 0.407514450867052,\n \"acc_norm_stderr\": 0.026454578146931498\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n \"acc_stderr\": 0.014756906483260664,\n \"acc_norm\": 0.264804469273743,\n \"acc_norm_stderr\": 0.014756906483260664\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.028431095444176643,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.028431095444176643\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4405144694533762,\n \"acc_stderr\": 0.028196400574197426,\n \"acc_norm\": 0.4405144694533762,\n \"acc_norm_stderr\": 0.028196400574197426\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4567901234567901,\n \"acc_stderr\": 0.02771666165019404,\n \"acc_norm\": 0.4567901234567901,\n \"acc_norm_stderr\": 0.02771666165019404\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.32978723404255317,\n \"acc_stderr\": 0.0280459469420424,\n \"acc_norm\": 0.32978723404255317,\n \"acc_norm_stderr\": 0.0280459469420424\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3239895697522816,\n \"acc_stderr\": 0.011952840809646575,\n \"acc_norm\": 0.3239895697522816,\n \"acc_norm_stderr\": 0.011952840809646575\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2536764705882353,\n \"acc_stderr\": 0.02643132987078953,\n \"acc_norm\": 0.2536764705882353,\n \"acc_norm_stderr\": 0.02643132987078953\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.019659922493623336,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.019659922493623336\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5306122448979592,\n \"acc_stderr\": 0.031949171367580624,\n \"acc_norm\": 0.5306122448979592,\n \"acc_norm_stderr\": 0.031949171367580624\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5522388059701493,\n \"acc_stderr\": 0.03516184772952166,\n \"acc_norm\": 0.5522388059701493,\n \"acc_norm_stderr\": 0.03516184772952166\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3373493975903614,\n \"acc_stderr\": 0.0368078369072758,\n \"acc_norm\": 0.3373493975903614,\n \"acc_norm_stderr\": 0.0368078369072758\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.03786720706234214,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.03786720706234214\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3402692778457772,\n \"mc1_stderr\": 0.016586304901762557,\n \"mc2\": 0.5124400262486783,\n \"mc2_stderr\": 0.016311439508262023\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6187845303867403,\n \"acc_stderr\": 0.013650172164160318\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/PotatoOff/HamSter-0.1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|arc:challenge|25_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|gsm8k|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hellaswag|10_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-11-37.945575.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-11-37.945575.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-11-37.945575.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T21-11-37.945575.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-11-37.945575.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-11-37.945575.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["**/details_harness|winogrande|5_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T21-11-37.945575.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T21_11_37.945575", "path": ["results_2024-01-13T21-11-37.945575.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T21-11-37.945575.parquet"]}]}]} | 2024-01-13T21:14:15+00:00 |
9bd92c6144993f93aad09bafd7cc1b040da9ce87 |
# Dataset Card for Evaluation run of shitshow123/TinyLlama-1.1B-ChatStrong-DPO-PPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [shitshow123/TinyLlama-1.1B-ChatStrong-DPO-PPO](https://huggingface.co/shitshow123/TinyLlama-1.1B-ChatStrong-DPO-PPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_shitshow123__TinyLlama-1.1B-ChatStrong-DPO-PPO",
"harness_winogrande_5",
split="train")
```
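The aggregated metrics can be loaded the same way through the "results" configuration. The snippet below is a minimal sketch; it assumes the "results" config and a "latest" split are exposed for this dataset as described above and as listed in the repository metadata.
```python
from datasets import load_dataset

# Load the aggregated results (one row per evaluation run); the "latest" split
# is assumed to point to the most recent run, per the configuration listing.
results = load_dataset(
    "open-llm-leaderboard/details_shitshow123__TinyLlama-1.1B-ChatStrong-DPO-PPO",
    "results",
    split="latest",
)
print(results)
```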
## Latest results
These are the [latest results from run 2024-01-13T21:14:09.825375](https://huggingface.co/datasets/open-llm-leaderboard/details_shitshow123__TinyLlama-1.1B-ChatStrong-DPO-PPO/blob/main/results_2024-01-13T21-14-09.825375.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24211167185537147,
"acc_stderr": 0.030313687760954437,
"acc_norm": 0.24298283398639836,
"acc_norm_stderr": 0.031121580752328053,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156482,
"mc2": 0.48873713948713277,
"mc2_stderr": 0.016987991365255056
},
"harness|arc:challenge|25": {
"acc": 0.23293515358361774,
"acc_stderr": 0.01235250704261739,
"acc_norm": 0.3037542662116041,
"acc_norm_stderr": 0.013438909184778757
},
"harness|hellaswag|10": {
"acc": 0.2568213503286198,
"acc_stderr": 0.004359871519639544,
"acc_norm": 0.2575184226249751,
"acc_norm_stderr": 0.004363736410689636
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.20394736842105263,
"acc_stderr": 0.032790004063100515,
"acc_norm": 0.20394736842105263,
"acc_norm_stderr": 0.032790004063100515
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.027134291628741713,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.027134291628741713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.03345036916788992,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.03345036916788992
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.19148936170212766,
"acc_stderr": 0.025722149992637795,
"acc_norm": 0.19148936170212766,
"acc_norm_stderr": 0.025722149992637795
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481404,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481404
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.14482758620689656,
"acc_stderr": 0.0293272432693634,
"acc_norm": 0.14482758620689656,
"acc_norm_stderr": 0.0293272432693634
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400168,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400168
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795131,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795131
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.14,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.14,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24193548387096775,
"acc_stderr": 0.024362599693031083,
"acc_norm": 0.24193548387096775,
"acc_norm_stderr": 0.024362599693031083
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2019704433497537,
"acc_stderr": 0.02824735012218027,
"acc_norm": 0.2019704433497537,
"acc_norm_stderr": 0.02824735012218027
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23737373737373738,
"acc_stderr": 0.030313710538198892,
"acc_norm": 0.23737373737373738,
"acc_norm_stderr": 0.030313710538198892
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3005181347150259,
"acc_stderr": 0.033088185944157494,
"acc_norm": 0.3005181347150259,
"acc_norm_stderr": 0.033088185944157494
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.0228158130988966,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.0228158130988966
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2815126050420168,
"acc_stderr": 0.02921354941437216,
"acc_norm": 0.2815126050420168,
"acc_norm_stderr": 0.02921354941437216
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26788990825688075,
"acc_stderr": 0.018987462257978652,
"acc_norm": 0.26788990825688075,
"acc_norm_stderr": 0.018987462257978652
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2175925925925926,
"acc_stderr": 0.02813968944485967,
"acc_norm": 0.2175925925925926,
"acc_norm_stderr": 0.02813968944485967
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.02845882099146029,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.02845882099146029
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.19282511210762332,
"acc_stderr": 0.02647824096048936,
"acc_norm": 0.19282511210762332,
"acc_norm_stderr": 0.02647824096048936
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.19834710743801653,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.19834710743801653,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.1901840490797546,
"acc_stderr": 0.030833491146281245,
"acc_norm": 0.1901840490797546,
"acc_norm_stderr": 0.030833491146281245
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.042450224863844935,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.042450224863844935
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.027601921381417604,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.027601921381417604
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.21328224776500637,
"acc_stderr": 0.014648172749593518,
"acc_norm": 0.21328224776500637,
"acc_norm_stderr": 0.014648172749593518
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22832369942196531,
"acc_stderr": 0.02259870380432164,
"acc_norm": 0.22832369942196531,
"acc_norm_stderr": 0.02259870380432164
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225601,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225601
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.023152722439402307,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.023152722439402307
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.21221864951768488,
"acc_stderr": 0.023222756797435122,
"acc_norm": 0.21221864951768488,
"acc_norm_stderr": 0.023222756797435122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.022021366100220204,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.022021366100220204
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2198581560283688,
"acc_stderr": 0.024706141070705477,
"acc_norm": 0.2198581560283688,
"acc_norm_stderr": 0.024706141070705477
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23272490221642764,
"acc_stderr": 0.010792595553888461,
"acc_norm": 0.23272490221642764,
"acc_norm_stderr": 0.010792595553888461
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.31985294117647056,
"acc_stderr": 0.028332959514031218,
"acc_norm": 0.31985294117647056,
"acc_norm_stderr": 0.028332959514031218
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2369281045751634,
"acc_stderr": 0.017201662169789786,
"acc_norm": 0.2369281045751634,
"acc_norm_stderr": 0.017201662169789786
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.038313051408846034,
"acc_norm": 0.2,
"acc_norm_stderr": 0.038313051408846034
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2897959183673469,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.2897959183673469,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772432,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772432
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2469879518072289,
"acc_stderr": 0.03357351982064537,
"acc_norm": 0.2469879518072289,
"acc_norm_stderr": 0.03357351982064537
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156482,
"mc2": 0.48873713948713277,
"mc2_stderr": 0.016987991365255056
},
"harness|winogrande|5": {
"acc": 0.5043409629044988,
"acc_stderr": 0.014051956064076896
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
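If you only need the aggregate numbers excerpted above, one option is to download the raw results file directly from the dataset repository. This is a sketch, not part of the generated card: the filename is taken from the link above and the exact nesting of the JSON is an assumption, so inspect the keys before relying on them.
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results JSON for this run (filename assumed from the link above).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_shitshow123__TinyLlama-1.1B-ChatStrong-DPO-PPO",
    filename="results_2024-01-13T21-14-09.825375.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# The excerpt above shows the aggregated metrics under an "all" entry; the raw
# file may nest them differently, so list the top-level keys first.
print(list(results.keys()))
```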
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_shitshow123__TinyLlama-1.1B-ChatStrong-DPO-PPO | [
"region:us"
] | 2024-01-13T21:15:56+00:00 | {"pretty_name": "Evaluation run of shitshow123/TinyLlama-1.1B-ChatStrong-DPO-PPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [shitshow123/TinyLlama-1.1B-ChatStrong-DPO-PPO](https://huggingface.co/shitshow123/TinyLlama-1.1B-ChatStrong-DPO-PPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shitshow123__TinyLlama-1.1B-ChatStrong-DPO-PPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T21:14:09.825375](https://huggingface.co/datasets/open-llm-leaderboard/details_shitshow123__TinyLlama-1.1B-ChatStrong-DPO-PPO/blob/main/results_2024-01-13T21-14-09.825375.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24211167185537147,\n \"acc_stderr\": 0.030313687760954437,\n \"acc_norm\": 0.24298283398639836,\n \"acc_norm_stderr\": 0.031121580752328053,\n \"mc1\": 0.25458996328029376,\n \"mc1_stderr\": 0.015250117079156482,\n \"mc2\": 0.48873713948713277,\n \"mc2_stderr\": 0.016987991365255056\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.23293515358361774,\n \"acc_stderr\": 0.01235250704261739,\n \"acc_norm\": 0.3037542662116041,\n \"acc_norm_stderr\": 0.013438909184778757\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2568213503286198,\n \"acc_stderr\": 0.004359871519639544,\n \"acc_norm\": 0.2575184226249751,\n \"acc_norm_stderr\": 0.004363736410689636\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.20394736842105263,\n \"acc_stderr\": 0.032790004063100515,\n \"acc_norm\": 0.20394736842105263,\n \"acc_norm_stderr\": 0.032790004063100515\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.027134291628741713,\n \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.027134291628741713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.03345036916788992,\n \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.03345036916788992\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.19148936170212766,\n \"acc_stderr\": 0.025722149992637795,\n \"acc_norm\": 0.19148936170212766,\n \"acc_norm_stderr\": 0.025722149992637795\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.040493392977481404,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.040493392977481404\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.14482758620689656,\n \"acc_stderr\": 0.0293272432693634,\n \"acc_norm\": 0.14482758620689656,\n \"acc_norm_stderr\": 0.0293272432693634\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400168,\n \"acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400168\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.04263906892795131,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.04263906892795131\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.14,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24193548387096775,\n \"acc_stderr\": 0.024362599693031083,\n \"acc_norm\": 0.24193548387096775,\n \"acc_norm_stderr\": 0.024362599693031083\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2019704433497537,\n \"acc_stderr\": 0.02824735012218027,\n \"acc_norm\": 0.2019704433497537,\n \"acc_norm_stderr\": 0.02824735012218027\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603489,\n \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603489\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.23737373737373738,\n \"acc_stderr\": 0.030313710538198892,\n \"acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.030313710538198892\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.3005181347150259,\n \"acc_stderr\": 0.033088185944157494,\n \"acc_norm\": 0.3005181347150259,\n 
\"acc_norm_stderr\": 0.033088185944157494\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.28205128205128205,\n \"acc_stderr\": 0.0228158130988966,\n \"acc_norm\": 0.28205128205128205,\n \"acc_norm_stderr\": 0.0228158130988966\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2815126050420168,\n \"acc_stderr\": 0.02921354941437216,\n \"acc_norm\": 0.2815126050420168,\n \"acc_norm_stderr\": 0.02921354941437216\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.26788990825688075,\n \"acc_stderr\": 0.018987462257978652,\n \"acc_norm\": 0.26788990825688075,\n \"acc_norm_stderr\": 0.018987462257978652\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2175925925925926,\n \"acc_stderr\": 0.02813968944485967,\n \"acc_norm\": 0.2175925925925926,\n \"acc_norm_stderr\": 0.02813968944485967\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25738396624472576,\n \"acc_stderr\": 0.02845882099146029,\n \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.02845882099146029\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.19282511210762332,\n \"acc_stderr\": 0.02647824096048936,\n \"acc_norm\": 0.19282511210762332,\n \"acc_norm_stderr\": 0.02647824096048936\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.19834710743801653,\n \"acc_stderr\": 0.036401182719909456,\n \"acc_norm\": 0.19834710743801653,\n \"acc_norm_stderr\": 0.036401182719909456\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.1901840490797546,\n \"acc_stderr\": 0.030833491146281245,\n \"acc_norm\": 0.1901840490797546,\n \"acc_norm_stderr\": 0.030833491146281245\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.042450224863844935,\n \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.042450224863844935\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23076923076923078,\n \"acc_stderr\": 0.027601921381417604,\n \"acc_norm\": 0.23076923076923078,\n \"acc_norm_stderr\": 0.027601921381417604\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.21328224776500637,\n \"acc_stderr\": 0.014648172749593518,\n \"acc_norm\": 0.21328224776500637,\n \"acc_norm_stderr\": 0.014648172749593518\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.22832369942196531,\n \"acc_stderr\": 0.02259870380432164,\n \"acc_norm\": 0.22832369942196531,\n \"acc_norm_stderr\": 0.02259870380432164\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n \"acc_stderr\": 0.014400296429225601,\n \"acc_norm\": 0.24581005586592178,\n \"acc_norm_stderr\": 0.014400296429225601\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.023152722439402307,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.023152722439402307\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.21221864951768488,\n \"acc_stderr\": 0.023222756797435122,\n \"acc_norm\": 0.21221864951768488,\n \"acc_norm_stderr\": 0.023222756797435122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.19444444444444445,\n \"acc_stderr\": 0.022021366100220204,\n \"acc_norm\": 0.19444444444444445,\n \"acc_norm_stderr\": 0.022021366100220204\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2198581560283688,\n \"acc_stderr\": 0.024706141070705477,\n \"acc_norm\": 0.2198581560283688,\n \"acc_norm_stderr\": 0.024706141070705477\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23272490221642764,\n \"acc_stderr\": 0.010792595553888461,\n \"acc_norm\": 0.23272490221642764,\n \"acc_norm_stderr\": 0.010792595553888461\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.31985294117647056,\n \"acc_stderr\": 0.028332959514031218,\n \"acc_norm\": 0.31985294117647056,\n \"acc_norm_stderr\": 0.028332959514031218\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2369281045751634,\n \"acc_stderr\": 0.017201662169789786,\n \"acc_norm\": 0.2369281045751634,\n \"acc_norm_stderr\": 0.017201662169789786\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.038313051408846034,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.038313051408846034\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2897959183673469,\n \"acc_stderr\": 0.02904308868330433,\n \"acc_norm\": 0.2897959183673469,\n \"acc_norm_stderr\": 0.02904308868330433\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n \"acc_stderr\": 0.029705284056772432,\n \"acc_norm\": 0.22885572139303484,\n \"acc_norm_stderr\": 0.029705284056772432\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2469879518072289,\n \"acc_stderr\": 0.03357351982064537,\n \"acc_norm\": 0.2469879518072289,\n \"acc_norm_stderr\": 0.03357351982064537\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.031267817146631786,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.031267817146631786\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n \"mc1_stderr\": 0.015250117079156482,\n \"mc2\": 0.48873713948713277,\n \"mc2_stderr\": 0.016987991365255056\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5043409629044988,\n \"acc_stderr\": 0.014051956064076896\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/shitshow123/TinyLlama-1.1B-ChatStrong-DPO-PPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|arc:challenge|25_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|gsm8k|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hellaswag|10_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-14-09.825375.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-14-09.825375.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-14-09.825375.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T21-14-09.825375.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-14-09.825375.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T21_14_09.825375", "path": ["**/details_harness|winogrande|5_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T21-14-09.825375.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T21_14_09.825375", "path": ["results_2024-01-13T21-14-09.825375.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T21-14-09.825375.parquet"]}]}]} | 2024-01-13T21:16:18+00:00 |
a63cb114c39e79b1a8a1803c98060e133d0c4b78 |
# Dataset of m3/M3/M3 (Girls' Frontline)
This is the dataset of m3/M3/M3 (Girls' Frontline), containing 18 images and their tags.
The core tags of this character are `blonde_hair, purple_eyes, short_hair, bangs, long_hair, breasts, ahoge`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 18 | 22.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m3_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 18 | 12.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m3_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 42 | 26.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m3_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 18 | 19.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m3_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 42 | 37.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m3_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/m3_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
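If you only want images from a particular outfit cluster (see the tag tables below), a plain filter on `item.meta['tags']` is enough. This is a minimal sketch building on the snippet above; the chosen tag (`military_uniform`) comes from the cluster table, and it assumes `item.meta['tags']` is a mapping or collection keyed by tag name, as printed by the loop above.
```python
from waifuc.source import LocalSource

# keep only images whose tags include a given outfit tag
# (assumption: item.meta['tags'] can be tested with `in` by tag name)
wanted_tag = 'military_uniform'

source = LocalSource('dataset_dir')
for item in source:
    if wanted_tag in item.meta['tags']:
        print(item.meta['filename'], 'matches', wanted_tag)
```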
## List of Clusters
List of tag clustering results; some outfits may be minable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, solo, looking_at_viewer, white_shirt, simple_background, blush, closed_mouth, holding, long_sleeves, military_uniform, belt, black_necktie, collared_shirt, jacket, white_background, gun, skirt, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | white_shirt | simple_background | blush | closed_mouth | holding | long_sleeves | military_uniform | belt | black_necktie | collared_shirt | jacket | white_background | gun | skirt | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------------|:--------------------|:--------|:---------------|:----------|:---------------|:-------------------|:-------|:----------------|:-----------------|:---------|:-------------------|:------|:--------|:--------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/m3_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T21:21:22+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T21:25:53+00:00 |
eacf77babfbd7836443e1f3a7cbc0bacd23dac21 |
# Dataset of libeccio/リベッチオ/西南风 (Azur Lane)
This is the dataset of libeccio/リベッチオ/西南风 (Azur Lane), containing 24 images and their tags.
The core tags of this character are `blue_eyes, white_hair, long_hair, hat, beret, breasts, bangs, multicolored_hair, blue_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 24 | 32.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/libeccio_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 17.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/libeccio_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 45 | 32.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/libeccio_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 28.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/libeccio_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 45 | 49.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/libeccio_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/libeccio_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
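The packaged variants listed in the table above can be fetched the same way as the raw archive. Below is a sketch for the `dataset-800.zip` package; the per-image `.txt` tag files are an assumption suggested by the IMG+TXT type, not something confirmed by this card.
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package instead of the raw archive
zip_file = hf_hub_download(
    repo_id='CyberHarem/libeccio_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

out_dir = 'dataset_800'
os.makedirs(out_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(out_dir)

# assumption: tags live in per-image .txt files next to the images
for root, _, files in os.walk(out_dir):
    for name in sorted(files):
        if name.endswith('.txt'):
            with open(os.path.join(root, name), encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```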
## List of Clusters
List of tag clustering results; some outfits may be minable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, blush, looking_at_viewer, smile, solo, long_sleeves, ahoge, bow, white_background, white_dress, closed_mouth, green_jacket, open_jacket, ribbon, black_footwear, green_headwear, hair_between_eyes, simple_background, small_breasts, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | smile | solo | long_sleeves | ahoge | bow | white_background | white_dress | closed_mouth | green_jacket | open_jacket | ribbon | black_footwear | green_headwear | hair_between_eyes | simple_background | small_breasts | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:--------|:-------|:---------------|:--------|:------|:-------------------|:--------------|:---------------|:---------------|:--------------|:---------|:-----------------|:-----------------|:--------------------|:--------------------|:----------------|:-----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/libeccio_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T21:22:16+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T21:29:27+00:00 |
5a286e3c175abcc6b3eab403159d0c5c97c20be5 |
# Dataset of agano/阿賀野/阿贺野 (Azur Lane)
This is the dataset of agano/阿賀野/阿贺野 (Azur Lane), containing 32 images and their tags.
The core tags of this character are `breasts, long_hair, red_eyes, black_hair, bangs, large_breasts, very_long_hair, ponytail, hair_ornament, ahoge, bow, hair_bow, red_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 32 | 44.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/agano_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 32 | 26.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/agano_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 78 | 52.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/agano_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 32 | 39.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/agano_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 78 | 72.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/agano_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/agano_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
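This character has two tag clusters (a kimono outfit and a coat/sweater outfit, see the cluster tables below), so a rough split is possible with a simple tag check. A minimal sketch, assuming `item.meta['tags']` contains the tag names listed in those tables and can be tested with `in`:
```python
from waifuc.source import LocalSource

# rough split of the two outfit clusters listed in the tables below
kimono_items, coat_items = [], []

for item in LocalSource('dataset_dir'):
    tags = item.meta['tags']
    if 'kimono' in tags:
        kimono_items.append(item)
    elif 'brown_coat' in tags or 'sweater_dress' in tags:
        coat_items.append(item)

print(len(kimono_items), 'kimono images,', len(coat_items), 'coat/sweater images')
```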
## List of Clusters
List of tag clustering results; some outfits may be minable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, collarbone, solo, looking_at_viewer, smile, blush, detached_sleeves, wide_sleeves, simple_background, black_pantyhose, black_skirt, kimono, obi, pleated_skirt, ribbon_trim, white_background, closed_mouth, open_mouth |
| 1 | 10 |  |  |  |  |  | blush, looking_at_viewer, 1girl, smile, solo, brown_coat, closed_mouth, black_pantyhose, hair_ribbon, turtleneck_sweater, aran_sweater, open_coat, sweater_dress, bag, holding, long_sleeves, red_ribbon, sleeveless_turtleneck, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | cleavage | collarbone | solo | looking_at_viewer | smile | blush | detached_sleeves | wide_sleeves | simple_background | black_pantyhose | black_skirt | kimono | obi | pleated_skirt | ribbon_trim | white_background | closed_mouth | open_mouth | brown_coat | hair_ribbon | turtleneck_sweater | aran_sweater | open_coat | sweater_dress | bag | holding | long_sleeves | red_ribbon | sleeveless_turtleneck |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-----------|:-------------|:-------|:--------------------|:--------|:--------|:-------------------|:---------------|:--------------------|:------------------|:--------------|:---------|:------|:----------------|:--------------|:-------------------|:---------------|:-------------|:-------------|:--------------|:---------------------|:---------------|:------------|:----------------|:------|:----------|:---------------|:-------------|:------------------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | | | | X | X | X | X | | | | X | | | | | | X | X | | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/agano_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T21:22:16+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T21:31:28+00:00 |
c75496c789dae6d7b425f6dd955657264322d76e | juansimonmolina/Voice_Model_Tests | [
"region:us"
] | 2024-01-13T21:30:51+00:00 | {} | 2024-01-13T21:47:03+00:00 |
|
46f17357a4b208ef9bfba1a9a489e85473b67892 | EduardoH/ibere | [
"license:openrail",
"region:us"
] | 2024-01-13T21:31:01+00:00 | {"license": "openrail"} | 2024-01-13T21:31:48+00:00 |
|
29b68f60e696038a887d5cd9ec70b75a44b07762 |
# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.4](https://huggingface.co/andysalerno/openchat-nectar-0.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andysalerno__openchat-nectar-0.4",
"harness_winogrande_5",
split="train")
```
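The aggregated metrics shown below can also be loaded directly from the "results" configuration. A minimal sketch, assuming the "latest" split naming used by the other configurations of this repo:
```python
from datasets import load_dataset

# aggregated metrics for the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_andysalerno__openchat-nectar-0.4",
    "results",
    split="latest",
)
print(results[0])
```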
## Latest results
These are the [latest results from run 2024-01-13T21:31:07.575473](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.4/blob/main/results_2024-01-13T21-31-07.575473.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6548155823628143,
"acc_stderr": 0.03186572656130094,
"acc_norm": 0.6554787207794509,
"acc_norm_stderr": 0.0325212982695846,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5170655549832318,
"mc2_stderr": 0.015386835244454444
},
"harness|arc:challenge|25": {
"acc": 0.6305460750853242,
"acc_stderr": 0.01410457836649189,
"acc_norm": 0.6663822525597269,
"acc_norm_stderr": 0.01377868705417654
},
"harness|hellaswag|10": {
"acc": 0.6356303525194185,
"acc_stderr": 0.004802694106203655,
"acc_norm": 0.8323043218482374,
"acc_norm_stderr": 0.003728322968874899
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106136,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106136
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660836,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660836
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.0340763209385405,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.0340763209385405
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233504,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233504
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.03021683101150878,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.03021683101150878
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8378033205619413,
"acc_stderr": 0.013182222616720893,
"acc_norm": 0.8378033205619413,
"acc_norm_stderr": 0.013182222616720893
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331154,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331154
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998482,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998482
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7654320987654321,
"acc_stderr": 0.02357688174400572,
"acc_norm": 0.7654320987654321,
"acc_norm_stderr": 0.02357688174400572
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4921773142112125,
"acc_stderr": 0.0127686730761119,
"acc_norm": 0.4921773142112125,
"acc_norm_stderr": 0.0127686730761119
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7316176470588235,
"acc_stderr": 0.026917481224377204,
"acc_norm": 0.7316176470588235,
"acc_norm_stderr": 0.026917481224377204
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162666,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162666
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7673469387755102,
"acc_stderr": 0.02704925791589618,
"acc_norm": 0.7673469387755102,
"acc_norm_stderr": 0.02704925791589618
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5170655549832318,
"mc2_stderr": 0.015386835244454444
},
"harness|winogrande|5": {
"acc": 0.8168902920284136,
"acc_stderr": 0.01086977863316837
},
"harness|gsm8k|5": {
"acc": 0.686125852918878,
"acc_stderr": 0.01278268125105319
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_andysalerno__openchat-nectar-0.4 | [
"region:us"
] | 2024-01-13T21:33:22+00:00 | {"pretty_name": "Evaluation run of andysalerno/openchat-nectar-0.4", "dataset_summary": "Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.4](https://huggingface.co/andysalerno/openchat-nectar-0.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__openchat-nectar-0.4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T21:31:07.575473](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.4/blob/main/results_2024-01-13T21-31-07.575473.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6548155823628143,\n \"acc_stderr\": 0.03186572656130094,\n \"acc_norm\": 0.6554787207794509,\n \"acc_norm_stderr\": 0.0325212982695846,\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5170655549832318,\n \"mc2_stderr\": 0.015386835244454444\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6305460750853242,\n \"acc_stderr\": 0.01410457836649189,\n \"acc_norm\": 0.6663822525597269,\n \"acc_norm_stderr\": 0.01377868705417654\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6356303525194185,\n \"acc_stderr\": 0.004802694106203655,\n \"acc_norm\": 0.8323043218482374,\n \"acc_norm_stderr\": 0.003728322968874899\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106136,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106136\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145634\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 
0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660836,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660836\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.0340763209385405,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.0340763209385405\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233504,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233504\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n \"acc_stderr\": 0.03021683101150878,\n \"acc_norm\": 0.7174887892376681,\n \"acc_norm_stderr\": 0.03021683101150878\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8378033205619413,\n \"acc_stderr\": 0.013182222616720893,\n \"acc_norm\": 0.8378033205619413,\n \"acc_norm_stderr\": 
0.013182222616720893\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331154,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331154\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998482,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998482\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7654320987654321,\n \"acc_stderr\": 0.02357688174400572,\n \"acc_norm\": 0.7654320987654321,\n \"acc_norm_stderr\": 0.02357688174400572\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4921773142112125,\n \"acc_stderr\": 0.0127686730761119,\n \"acc_norm\": 0.4921773142112125,\n \"acc_norm_stderr\": 0.0127686730761119\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7316176470588235,\n \"acc_stderr\": 0.026917481224377204,\n \"acc_norm\": 0.7316176470588235,\n \"acc_norm_stderr\": 0.026917481224377204\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162666,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162666\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.02704925791589618,\n \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.02704925791589618\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5170655549832318,\n \"mc2_stderr\": 0.015386835244454444\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8168902920284136,\n \"acc_stderr\": 0.01086977863316837\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.686125852918878,\n \"acc_stderr\": 0.01278268125105319\n }\n}\n```", "repo_url": "https://huggingface.co/andysalerno/openchat-nectar-0.4", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|arc:challenge|25_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|gsm8k|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hellaswag|10_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-31-07.575473.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-31-07.575473.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-31-07.575473.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T21-31-07.575473.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-31-07.575473.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-31-07.575473.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["**/details_harness|winogrande|5_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T21-31-07.575473.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T21_31_07.575473", "path": ["results_2024-01-13T21-31-07.575473.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T21-31-07.575473.parquet"]}]}]} | 2024-01-13T21:33:43+00:00 |
4eada6230f32cc1c73061eca2675f36bb3e99184 |
An augmented and further cleaned version of [PIPPA-shareGPT](https://huggingface.co/datasets/kingbri/PIPPA-shareGPT) (specifically `pippa_sharegpt_trimmed.jsonl`, drawn from [PygmalionAI's PIPPA](https://huggingface.co/datasets/PygmalionAI/PIPPA)) in FastChat format, modified in the following ways:
- The first prompt is modified to add context and simple references to aspects of the conversation (OOC, use of emojis, content).
- All {name} and {char} placeholders were replaced with actual names and characters randomly generated by [Faker](https://pypi.org/project/Faker/).
- Very short conversations (<50 tokens) removed.
- Further de-duplicated, keeping the longest unique conversation.
- Conversations were made strictly alternating (user/assistant), always starting with the user and ending with the assistant. | grimulkan/PIPPA-augmented-dedup | [
"license:unknown",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T21:37:44+00:00 | {"license": "unknown", "tags": ["not-for-all-audiences"]} | 2024-01-24T00:00:39+00:00 |
aa2c305bdff163bfe1ec3e1b9aa461eeb2f1e019 |
math_qa converted to Python snippets | euclaise/mathqa_programs | [
"license:apache-2.0",
"region:us"
] | 2024-01-13T21:42:41+00:00 | {"license": "apache-2.0", "dataset_info": {"features": [{"name": "options", "dtype": "string"}, {"name": "correct", "dtype": "string"}, {"name": "annotated_formula", "dtype": "string"}, {"name": "problem", "dtype": "string"}, {"name": "rationale", "dtype": "string"}, {"name": "program", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 17017833, "num_examples": 28851}], "download_size": 8877888, "dataset_size": 17017833}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-13T21:46:00+00:00 |
73237d4b03c7fa06d0e5fe7725a0a478b2b7b776 |
# Dataset of foch/フォッシュ/福煦 (Azur Lane)
This is the dataset of foch/フォッシュ/福煦 (Azur Lane), containing 76 images and their tags.
The core tags of this character are `breasts, purple_hair, bangs, hair_between_eyes, multicolored_hair, long_hair, large_breasts, ahoge, crossed_bangs, grey_hair, red_eyes, pink_eyes, blue_hair, hair_ornament, sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 76 | 113.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/foch_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 76 | 55.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/foch_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 177 | 116.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/foch_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 76 | 95.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/foch_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 177 | 175.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/foch_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/foch_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
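As a small follow-up, the sketch below shows one way to filter the loaded items by tag and copy only the matching images into a separate folder. It is a minimal example rather than part of the original tooling: it reuses only the `LocalSource` iteration shown above plus the standard library, and it assumes `item.image` behaves like a PIL image (so `.save()` is available) and that `item.meta['tags']` supports membership tests; the `solo` tag and the output folder name are placeholders you can change.

```python
import os

from waifuc.source import LocalSource

# reuse the directory extracted in the previous snippet
dataset_dir = 'dataset_dir'
out_dir = 'foch_solo_only'  # hypothetical output folder
os.makedirs(out_dir, exist_ok=True)

for item in LocalSource(dataset_dir):
    tags = item.meta['tags']  # tags attached by the crawler (list- or dict-like)
    if 'solo' in tags:
        # assumes item.image is a PIL image, so .save() works
        item.image.save(os.path.join(out_dir, item.meta['filename']))
```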
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, solo, white_leotard, blush, cowboy_shot, cross_hair_ornament, epaulettes, long_sleeves, looking_at_viewer, simple_background, standing, thighhighs, white_background, cape, groin, highleg, jacket, open_mouth, smile |
| 1 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_shorts, cropped_sweater, looking_at_viewer, off-shoulder_sweater, official_alternate_costume, smile, solo, white_sweater, cowboy_shot, midriff, navel, open_mouth, white_background, blush, simple_background, two-tone_hair, bag, cleavage, long_sleeves, petals, standing |
| 2 | 7 |  |  |  |  |  | 1girl, bare_shoulders, black_shorts, collarbone, cropped_sweater, high-waist_shorts, looking_at_viewer, off-shoulder_sweater, official_alternate_costume, solo, standing, thigh_holster, white_sweater, blush, cleavage, handbag, sleeves_past_wrists, long_sleeves, parted_lips, shoulder_bag, smile, zipper_pull_tab, white_background, cowboy_shot, full_body, legs, shoes, two-tone_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | white_leotard | blush | cowboy_shot | cross_hair_ornament | epaulettes | long_sleeves | looking_at_viewer | simple_background | standing | thighhighs | white_background | cape | groin | highleg | jacket | open_mouth | smile | bare_shoulders | black_shorts | cropped_sweater | off-shoulder_sweater | official_alternate_costume | white_sweater | midriff | navel | two-tone_hair | bag | cleavage | petals | collarbone | high-waist_shorts | thigh_holster | handbag | sleeves_past_wrists | parted_lips | shoulder_bag | zipper_pull_tab | full_body | legs | shoes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------------|:--------|:--------------|:----------------------|:-------------|:---------------|:--------------------|:--------------------|:-----------|:-------------|:-------------------|:-------|:--------|:----------|:---------|:-------------|:--------|:-----------------|:---------------|:------------------|:-----------------------|:-----------------------------|:----------------|:----------|:--------|:----------------|:------|:-----------|:---------|:-------------|:--------------------|:----------------|:----------|:----------------------|:--------------|:---------------|:------------------|:------------|:-------|:--------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | | X | X | | | X | X | X | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | | X | X | | | X | X | | X | | X | | | | | | X | X | X | X | X | X | X | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/foch_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T21:42:54+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T22:02:48+00:00 |