Dataset viewer column summary (type and min/max length per column):

| column | type | min | max |
| --- | --- | --- | --- |
| sha | string | 40 | 40 |
| text | string | 1 | 13.4M |
| id | string | 2 | 117 |
| tags | sequence | 1 | 7.91k |
| created_at | string | 25 | 25 |
| metadata | string | 2 | 875k |
| last_modified | string | 25 | 25 |
| arxiv | sequence | 0 | 25 |
| languages | sequence | 0 | 7.91k |
| tags_str | string | 17 | 159k |
| text_str | string | 1 | 447k |
| text_lists | sequence | 0 | 352 |
| processed_texts | sequence | 1 | 353 |
| tokens_length | sequence | 1 | 353 |
| input_texts | sequence | 1 | 40 |
Row 1:

- sha: c7bab4f141276b4deb1a019116da03c0452c5b6e
- text: Synthetic images with very low aesthetic scores from a variety of datasets, compressed to comical levels for training against.
- id: Blackroot/Badly-Compressed-Jpeg-For-Anti-Artifact-Training
- tags: ["region:us"]
- created_at: 2024-02-14T02:27:33+00:00
- metadata: {}
- last_modified: 2024-02-14T02:29:44+00:00
- arxiv: []
- languages: []
- tags_str: "TAGS\n#region-us \n"
- text_str: Synthetic images with very low aesthetic scores from a variety of datasets, compressed to comical levels for training against.
- text_lists: []
- processed_texts: ["TAGS\n#region-us \n"]
- tokens_length: [6]
- input_texts: ["passage: TAGS\n#region-us \n"]
Row 2:

- sha: 4610867be33156f6ae7e2fb027b2c7adc7c37e7b
- text:
# Dataset Card for Evaluation run of ENERGY-DRINK-LOVE/SOLAR_merge2_dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ENERGY-DRINK-LOVE/SOLAR_merge2_dpo](https://huggingface.co/ENERGY-DRINK-LOVE/SOLAR_merge2_dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ENERGY-DRINK-LOVE__SOLAR_merge2_dpo",
"harness_winogrande_5",
split="train")
```
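The available configurations, including the aggregated "results" one, can also be discovered and loaded programmatically. Below is a minimal sketch; the config and split names follow the pattern declared in this card's metadata, so adjust them for other runs:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_ENERGY-DRINK-LOVE__SOLAR_merge2_dpo"

# Enumerate the 63 per-task configurations plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Each config declares a "latest" split tracking the newest run, alongside a
# timestamped split (e.g. "2024_02_14T02_26_27.652870").
aggregated = load_dataset(repo, "results", split="latest")
```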
## Latest results
These are the [latest results from run 2024-02-14T02:26:27.652870](https://huggingface.co/datasets/open-llm-leaderboard/details_ENERGY-DRINK-LOVE__SOLAR_merge2_dpo/blob/main/results_2024-02-14T02-26-27.652870.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6448739833694188,
"acc_stderr": 0.032084108113789686,
"acc_norm": 0.6487452419330038,
"acc_norm_stderr": 0.03273096599243633,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5127603711978598,
"mc2_stderr": 0.014659099266296721
},
"harness|arc:challenge|25": {
"acc": 0.5989761092150171,
"acc_stderr": 0.014322255790719867,
"acc_norm": 0.64419795221843,
"acc_norm_stderr": 0.013990571137918763
},
"harness|hellaswag|10": {
"acc": 0.6279625572595101,
"acc_stderr": 0.004823604775015909,
"acc_norm": 0.8273252340171281,
"acc_norm_stderr": 0.003771934042799158
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03583496176361074,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03583496176361074
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4576719576719577,
"acc_stderr": 0.025658868862058336,
"acc_norm": 0.4576719576719577,
"acc_norm_stderr": 0.025658868862058336
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.025545650426603613,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.025545650426603613
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131137,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131137
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02615686752393104,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02615686752393104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8649789029535865,
"acc_stderr": 0.022245776632003694,
"acc_norm": 0.8649789029535865,
"acc_norm_stderr": 0.022245776632003694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.038073871163060866,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.038073871163060866
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.01987565502786744,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.01987565502786744
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3452513966480447,
"acc_stderr": 0.01590143260893036,
"acc_norm": 0.3452513966480447,
"acc_norm_stderr": 0.01590143260893036
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539967,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4765319426336376,
"acc_stderr": 0.012756161942523363,
"acc_norm": 0.4765319426336376,
"acc_norm_stderr": 0.012756161942523363
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7279411764705882,
"acc_stderr": 0.02703304115168146,
"acc_norm": 0.7279411764705882,
"acc_norm_stderr": 0.02703304115168146
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.019206606848825365,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.019206606848825365
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.02783302387139969,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.02783302387139969
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5127603711978598,
"mc2_stderr": 0.014659099266296721
},
"harness|winogrande|5": {
"acc": 0.8176795580110497,
"acc_stderr": 0.010851565594267198
},
"harness|gsm8k|5": {
"acc": 0.488248673237301,
"acc_stderr": 0.0137686804081428
}
}
```
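For programmatic analysis, the raw JSON shown above can be downloaded and aggregated directly. The sketch below uses the repo id and filename from the link above, and hedges on the file layout (the task scores may sit under a top-level "results" key rather than at the root, as printed here):

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file linked above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_ENERGY-DRINK-LOVE__SOLAR_merge2_dpo",
    filename="results_2024-02-14T02-26-27.652870.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The task scores may sit under a "results" key; fall back to the top level
# if the file is laid out exactly as printed above.
scores = data.get("results", data)

# Macro-average accuracy over the 57 MMLU (hendrycksTest) subtasks.
mmlu = [v["acc"] for k, v in scores.items() if k.startswith("harness|hendrycksTest")]
print(f"MMLU macro-average over {len(mmlu)} subtasks: {sum(mmlu) / len(mmlu):.4f}")
```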
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]

- id: open-llm-leaderboard/details_ENERGY-DRINK-LOVE__SOLAR_merge2_dpo
- tags: ["region:us"
] | 2024-02-14T02:28:43+00:00 | {"pretty_name": "Evaluation run of ENERGY-DRINK-LOVE/SOLAR_merge2_dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [ENERGY-DRINK-LOVE/SOLAR_merge2_dpo](https://huggingface.co/ENERGY-DRINK-LOVE/SOLAR_merge2_dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ENERGY-DRINK-LOVE__SOLAR_merge2_dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T02:26:27.652870](https://huggingface.co/datasets/open-llm-leaderboard/details_ENERGY-DRINK-LOVE__SOLAR_merge2_dpo/blob/main/results_2024-02-14T02-26-27.652870.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6448739833694188,\n \"acc_stderr\": 0.032084108113789686,\n \"acc_norm\": 0.6487452419330038,\n \"acc_norm_stderr\": 0.03273096599243633,\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5127603711978598,\n \"mc2_stderr\": 0.014659099266296721\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5989761092150171,\n \"acc_stderr\": 0.014322255790719867,\n \"acc_norm\": 0.64419795221843,\n \"acc_norm_stderr\": 0.013990571137918763\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6279625572595101,\n \"acc_stderr\": 0.004823604775015909,\n \"acc_norm\": 0.8273252340171281,\n \"acc_norm_stderr\": 0.003771934042799158\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361074,\n \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361074\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 
0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4576719576719577,\n \"acc_stderr\": 0.025658868862058336,\n \"acc_norm\": 0.4576719576719577,\n \"acc_norm_stderr\": 0.025658868862058336\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.025545650426603613,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.025545650426603613\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131137,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131137\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.02615686752393104,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02615686752393104\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8649789029535865,\n \"acc_stderr\": 0.022245776632003694,\n \"acc_norm\": 0.8649789029535865,\n \"acc_norm_stderr\": 0.022245776632003694\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.038073871163060866,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.038073871163060866\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n \"acc_stderr\": 0.01987565502786744,\n \"acc_norm\": 0.8974358974358975,\n \"acc_norm_stderr\": 0.01987565502786744\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3452513966480447,\n \"acc_stderr\": 0.01590143260893036,\n \"acc_norm\": 0.3452513966480447,\n \"acc_norm_stderr\": 0.01590143260893036\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539967,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539967\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4765319426336376,\n \"acc_stderr\": 0.012756161942523363,\n \"acc_norm\": 0.4765319426336376,\n \"acc_norm_stderr\": 0.012756161942523363\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7279411764705882,\n \"acc_stderr\": 0.02703304115168146,\n \"acc_norm\": 0.7279411764705882,\n \"acc_norm_stderr\": 0.02703304115168146\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.019206606848825365,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.019206606848825365\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139969,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139969\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5127603711978598,\n \"mc2_stderr\": 0.014659099266296721\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8176795580110497,\n \"acc_stderr\": 0.010851565594267198\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.488248673237301,\n \"acc_stderr\": 0.0137686804081428\n }\n}\n```", 
"repo_url": "https://huggingface.co/ENERGY-DRINK-LOVE/SOLAR_merge2_dpo", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|arc:challenge|25_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|gsm8k|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hellaswag|10_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T02-26-27.652870.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T02-26-27.652870.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T02-26-27.652870.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T02-26-27.652870.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T02-26-27.652870.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T02_26_27.652870", "path": ["**/details_harness|winogrande|5_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T02-26-27.652870.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_14T02_26_27.652870", "path": ["results_2024-02-14T02-26-27.652870.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T02-26-27.652870.parquet"]}]}]} | 2024-02-14T02:29:07+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ENERGY-DRINK-LOVE/SOLAR_merge2_dpo
Dataset automatically created during the evaluation run of model ENERGY-DRINK-LOVE/SOLAR_merge2_dpo on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
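A minimal sketch (the repository name below follows the leaderboard's `details_<org>__<model>` naming convention and mirrors the loader shown for the other evaluation-run card in this document):

```python
from datasets import load_dataset

# Repository name inferred from the leaderboard's naming convention
data = load_dataset("open-llm-leaderboard/details_ENERGY-DRINK-LOVE__SOLAR_merge2_dpo",
	"harness_winogrande_5",
	split="train")
```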
## Latest results
These are the latest results from run 2024-02-14T02:26:27.652870 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ENERGY-DRINK-LOVE/SOLAR_merge2_dpo\n\n\n\nDataset automatically created during the evaluation run of model ENERGY-DRINK-LOVE/SOLAR_merge2_dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T02:26:27.652870(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ENERGY-DRINK-LOVE/SOLAR_merge2_dpo\n\n\n\nDataset automatically created during the evaluation run of model ENERGY-DRINK-LOVE/SOLAR_merge2_dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T02:26:27.652870(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
199,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ENERGY-DRINK-LOVE/SOLAR_merge2_dpo\n\n\n\nDataset automatically created during the evaluation run of model ENERGY-DRINK-LOVE/SOLAR_merge2_dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T02:26:27.652870(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
5aa2cbeb0e9173c686f14381fa7633188b04f1d5 | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- cc-by-nc-nd-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- summarization
task_ids: []
paperswithcode_id: samsum-corpus
pretty_name: SAMSum Corpus
tags:
- conversations-summarization
dataset_info:
features:
- name: id
dtype: string
- name: dialogue
dtype: string
- name: summary
dtype: string
config_name: samsum
splits:
- name: train
num_bytes: 9479141
num_examples: 14732
- name: test
num_bytes: 534492
num_examples: 819
- name: validation
num_bytes: 516431
num_examples: 818
download_size: 2944100
dataset_size: 10530064
train-eval-index:
- config: samsum
task: summarization
task_id: summarization
splits:
eval_split: test
col_mapping:
dialogue: text
summary: target
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
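While this section awaits documentation, the YAML front matter above already pins down the basic schema, so a hedged sketch is possible (field names and split sizes come from the `dataset_info` block; nothing else about this copy is verified):

```python
from datasets import load_dataset

# Per the front matter: string fields "id", "dialogue", and "summary",
# with train/test/validation splits of 14732/819/818 examples
ds = load_dataset("longAtSJSU/TrainData")
print(ds["train"][0]["dialogue"])
print(ds["train"][0]["summary"])
```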
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | longAtSJSU/TrainData | [
"task_categories:text-classification",
"language:en",
"license:other",
"region:us"
] | 2024-02-14T02:37:14+00:00 | {"language": ["en"], "license": "other", "task_categories": ["text-classification"], "pretty_name": "resever"} | 2024-02-15T05:02:23+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-classification #language-English #license-other #region-us
| ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- cc-by-nc-nd-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- summarization
task_ids: []
paperswithcode_id: samsum-corpus
pretty_name: SAMSum Corpus
tags:
- conversations-summarization
dataset_info:
features:
- name: id
dtype: string
- name: dialogue
dtype: string
- name: summary
dtype: string
config_name: samsum
splits:
- name: train
num_bytes: 9479141
num_examples: 14732
- name: test
num_bytes: 534492
num_examples: 819
- name: validation
num_bytes: 516431
num_examples: 818
download_size: 2944100
dataset_size: 10530064
train-eval-index:
- config: samsum
task: summarization
task_id: summarization
splits:
eval_split: test
col_mapping:
dialogue: text
summary: target
---
# Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#task_categories-text-classification #language-English #license-other #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
26,
34,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#task_categories-text-classification #language-English #license-other #region-us \n# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
e17f257af3805acff9f49f4b59a92be049812940 |
## Overview
This dataset features a curated collection of questions and answers synthesized to cover key topics in frontend development. Topics include HTML, CSS, JS, React JS, and Next JS, arranged in a curriculum-like manner.
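A minimal loading sketch (the `question`/`answer` field names come from this record's dataset metadata):

```python
from datasets import load_dataset

# Each of the 45,448 training rows pairs a frontend question with its answer
ds = load_dataset("Tensoic/FrontendCookbook", split="train")
print(ds[0]["question"])
print(ds[0]["answer"])
```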
## Caution
This dataset was generated using Bard; please note that some content may not be entirely precise or reflect expert consensus.
Users are encouraged to verify information independently for scholarly or critical purposes. | Tensoic/FrontendCookbook | [
"task_categories:text-generation",
"language:en",
"license:apache-2.0",
"code",
"react",
"nextjs",
"frontend",
"region:us"
] | 2024-02-14T02:41:14+00:00 | {"language": ["en"], "license": "apache-2.0", "task_categories": ["text-generation"], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 91175739, "num_examples": 45448}], "download_size": 33773424, "dataset_size": 91175739}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["code", "react", "nextjs", "frontend"]} | 2024-02-14T02:55:23+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #language-English #license-apache-2.0 #code #react #nextjs #frontend #region-us
|
## Overview
This dataset features a curated collection of questions and answers synthesized to cover key topics in frontend development. Topics include HTML, CSS, JS, React JS, and Next JS, arranged in a curriculum-like manner.
## Caution
This dataset was generated using Bard; please note that some content may not be entirely precise or reflect expert consensus.
Users are encouraged to verify information independently for scholarly or critical purposes. | [
"## Overview\nThis dataset features a curated collection of questions and answers synthesized to cover key topics in Frontend development. Topics include HTML. CSS, JS, React JS, Next JS in a circullum manner.",
"## Caution\n\nThis dataset was generated using Bard, please note that some content may not be entirely precise or reflect expert consensus. \nUsers are encouraged to verify information independently for scholarly or critical purposes."
] | [
"TAGS\n#task_categories-text-generation #language-English #license-apache-2.0 #code #react #nextjs #frontend #region-us \n",
"## Overview\nThis dataset features a curated collection of questions and answers synthesized to cover key topics in Frontend development. Topics include HTML. CSS, JS, React JS, Next JS in a circullum manner.",
"## Caution\n\nThis dataset was generated using Bard, please note that some content may not be entirely precise or reflect expert consensus. \nUsers are encouraged to verify information independently for scholarly or critical purposes."
] | [
41,
54,
48
] | [
"passage: TAGS\n#task_categories-text-generation #language-English #license-apache-2.0 #code #react #nextjs #frontend #region-us \n## Overview\nThis dataset features a curated collection of questions and answers synthesized to cover key topics in Frontend development. Topics include HTML. CSS, JS, React JS, Next JS in a circullum manner.## Caution\n\nThis dataset was generated using Bard, please note that some content may not be entirely precise or reflect expert consensus. \nUsers are encouraged to verify information independently for scholarly or critical purposes."
] |
cd26e106b4c23f0ddd39e746986d8fffe2e8d1ec |
## Description
Footage of all flags of the world!
## Model
SVD
## Tags
- Flags
## Voice
Julian
## Music
balearic deep house music
## Prompt
A video channel about flags
| IsraelJordan1/ai-tube-flags | [
"license:cc-by-nc-4.0",
"region:us"
] | 2024-02-14T02:53:55+00:00 | {"license": "cc-by-nc-4.0", "pretty_name": "FlagsWorld"} | 2024-02-14T03:13:26+00:00 | [] | [] | TAGS
#license-cc-by-nc-4.0 #region-us
|
## Description
Footage of all flags of the world!
## Model
SVD
## Tags
- Flags
## Voice
Julian
## Music
balearic deep house music
## Prompt
A video channel about flags
| [
"## Description\n\n Footage of all flags of the world!",
"## Model\n\nSVD",
"## Tags\n\n- Flags",
"## Voice\n\nJulian",
"## Music\n\nbalearic deep house music",
"## Prompt\n\nA video channel about flags"
] | [
"TAGS\n#license-cc-by-nc-4.0 #region-us \n",
"## Description\n\n Footage of all flags of the world!",
"## Model\n\nSVD",
"## Tags\n\n- Flags",
"## Voice\n\nJulian",
"## Music\n\nbalearic deep house music",
"## Prompt\n\nA video channel about flags"
] | [
17,
12,
4,
5,
3,
8,
10
] | [
"passage: TAGS\n#license-cc-by-nc-4.0 #region-us \n## Description\n\n Footage of all flags of the world!## Model\n\nSVD## Tags\n\n- Flags## Voice\n\nJulian## Music\n\nbalearic deep house music## Prompt\n\nA video channel about flags"
] |
db835b9e09621548ed12219f3dc5fb9c433cd800 |
# DrNicefellow's Worry-Free General Chat Dataset v1
## Overview
This dataset contains high-quality general chat samples of questions and answers. It is designed following the LIMA (Less Is More for Alignment) principle from Meta AI, emphasizing the importance of quality over quantity in training data. Despite its modest size, the dataset's quality ensures its effectiveness in training and fine-tuning conversational AI models.
In this version, each chat has one user query and one assistant answer. In the next version, each chat will become a multi-round conversation.
## Dataset Format
The dataset is structured in the Vicuna 1.1 format, featuring one-round chats. This format is chosen for its compatibility with various conversational AI training paradigms and its efficiency in representing dialogues.
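For reference, a one-round sample in the Vicuna 1.1 convention typically looks like the string sketched below (illustrative only; the exact system preamble used in this dataset is an assumption):

```python
# Illustrative Vicuna 1.1 one-round layout; not copied from the dataset
sample = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's questions. "
    "USER: <user query> ASSISTANT: <assistant answer>"
)
```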
## Volume
The dataset comprises roughly 22,000 chat samples (21,982 in the train split). Each sample has been carefully curated to ensure the highest quality, aligning with the LIMA principle.
## Licensing
Our dataset is worry-free regarding proprietary issues, as it is not automatically generated by a proprietary chatbot. This dataset is released under the Apache License 2.0. This license allows for broad freedom in usage and modification, provided that proper credit is given and changes are documented. For full license terms, please refer to the LICENSE file.
## Use Case
This dataset is ideal for training conversational AI models. It can help in developing chatbots or virtual assistants capable of handling a wide range of queries with high accuracy. To use the dataset for finetuning a model with Axolotl, simply add the following to the .yml file:
datasets:
  - path: DrNicefellow/Quality_WorryFree_GeneralQA_Chat_Dataset-v1
    type: completion
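For inspection outside of a trainer, a minimal sketch (the single `text` field name comes from this record's dataset metadata):

```python
from datasets import load_dataset

# Each row carries one full one-round chat in the "text" field
ds = load_dataset("DrNicefellow/Quality_WorryFree_GeneralQA_Chat_Dataset-v1", split="train")
print(ds[0]["text"])
```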
## Feeling Generous? 😊
Eager to buy me a cup of $2 coffee or iced tea?🍵☕ Sure, here is the link: [https://ko-fi.com/drnicefellow](https://ko-fi.com/drnicefellow). Please add a note on which one you want me to drink. | DrNicefellow/Quality_WorryFree_GeneralQA_Chat_Dataset-v1 | [
"license:apache-2.0",
"region:us"
] | 2024-02-14T03:41:21+00:00 | {"license": "apache-2.0", "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 37678642, "num_examples": 21982}], "download_size": 16914098, "dataset_size": 37678642}} | 2024-02-14T04:02:31+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
# DrNicefellow's Worry-Free General Chat Dataset v1
## Overview
This dataset contains high-quality general chat samples of questions and answers. It is designed following the LIMA (Less Is More for Alignment) principle from Meta AI, emphasizing the importance of quality over quantity in training data. Despite its modest size, the dataset's quality ensures its effectiveness in training and fine-tuning conversational AI models.
In this version, each chat has one user query and one assistant answer. In the next version, each chat will become a multi-round conversation.
## Dataset Format
The dataset is structured in the Vicuna 1.1 format, featuring one-round chats. This format is chosen for its compatibility with various conversational AI training paradigms and its efficiency in representing dialogues.
## Volume
The dataset comprises roughly 22,000 chat samples (21,982 in the train split). Each sample has been carefully curated to ensure the highest quality, aligning with the LIMA principle.
## Licensing
Our dataset is worry-free regarding proprietary issues, as it is not automatically generated by a proprietary chatbot. This dataset is released under the Apache License 2.0. This license allows for broad freedom in usage and modification, provided that proper credit is given and changes are documented. For full license terms, please refer to the LICENSE file.
## Use Case
This dataset is ideal for training conversational AI models. It can help in developing chatbots or virtual assistants capable of handling a wide range of queries with high accuracy. To use the dataset for finetuning a model with Axolotl, simply add the following to the .yml file:
datasets:
  - path: DrNicefellow/Quality_WorryFree_GeneralQA_Chat_Dataset-v1
    type: completion
## Feeling Generous?
Eager to buy me a cup of $2 coffee or iced tea? Sure, here is the link: URL Please add a note on which one you want me to drink. | [
"# Dr. Nicefollows's Worry Free General Chat Dataset v1",
"## Overview\nThis dataset contains high-quality general chat samples questions and answers. It is designed following the LIMA: Less Is More for Alignment principle from MetaAI: emphasizing the importance of quality over quantity in training data. Despite its modest size, the dataset's quality ensures its effectiveness in training and fine-tuning conversational AI models.\nIn this version, each chat has one user query and assistant answer. In the next version, it will become a conversation of multiple rounds.",
"## Dataset Format\nThe dataset is structured in the Vicuna 1.1 format, featuring one-round chats. This format is chosen for its compatibility with various conversational AI training paradigms and its efficiency in representing dialogues.",
"## Volume\nThe dataset comprises a few thousand chat samples. Each sample has been carefully curated to ensure the highest quality, aligning with the LIMA principle.",
"## Licensing\nOur dataset is worry-free regarding proprietary issues, as it is not automatically generated by a proprietary chatbot. This dataset is released under the Apache License 2.0. This license allows for broad freedom in usage and modification, provided that proper credit is given and changes are documented. For full license terms, please refer to the LICENSE file.",
"## Use Case\nThis dataset is ideal for training conversational AI models. It can help in developing chatbots or virtual assistants capable of handling a wide range of queries with high accuracy. To use the dataset for finetuning a model with Axolotl, simply add the following to the .yml file:\ndatasets:\n - path: DrNicefellow/Quality_WorryFree_GeneralQA_Chat_Dataset-v1\n - type: completion",
"## Feeling Generous? \nEager to buy me a cup of 2$ coffe or iced tea? Sure, here is the link: URL Please add a note on which one you want me to drink?"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"# Dr. Nicefollows's Worry Free General Chat Dataset v1",
"## Overview\nThis dataset contains high-quality general chat samples questions and answers. It is designed following the LIMA: Less Is More for Alignment principle from MetaAI: emphasizing the importance of quality over quantity in training data. Despite its modest size, the dataset's quality ensures its effectiveness in training and fine-tuning conversational AI models.\nIn this version, each chat has one user query and assistant answer. In the next version, it will become a conversation of multiple rounds.",
"## Dataset Format\nThe dataset is structured in the Vicuna 1.1 format, featuring one-round chats. This format is chosen for its compatibility with various conversational AI training paradigms and its efficiency in representing dialogues.",
"## Volume\nThe dataset comprises a few thousand chat samples. Each sample has been carefully curated to ensure the highest quality, aligning with the LIMA principle.",
"## Licensing\nOur dataset is worry-free regarding proprietary issues, as it is not automatically generated by a proprietary chatbot. This dataset is released under the Apache License 2.0. This license allows for broad freedom in usage and modification, provided that proper credit is given and changes are documented. For full license terms, please refer to the LICENSE file.",
"## Use Case\nThis dataset is ideal for training conversational AI models. It can help in developing chatbots or virtual assistants capable of handling a wide range of queries with high accuracy. To use the dataset for finetuning a model with Axolotl, simply add the following to the .yml file:\ndatasets:\n - path: DrNicefellow/Quality_WorryFree_GeneralQA_Chat_Dataset-v1\n - type: completion",
"## Feeling Generous? \nEager to buy me a cup of 2$ coffe or iced tea? Sure, here is the link: URL Please add a note on which one you want me to drink?"
] | [
14,
17,
116,
50,
36,
79,
106,
45
] | [
"passage: TAGS\n#license-apache-2.0 #region-us \n# Dr. Nicefollows's Worry Free General Chat Dataset v1## Overview\nThis dataset contains high-quality general chat samples questions and answers. It is designed following the LIMA: Less Is More for Alignment principle from MetaAI: emphasizing the importance of quality over quantity in training data. Despite its modest size, the dataset's quality ensures its effectiveness in training and fine-tuning conversational AI models.\nIn this version, each chat has one user query and assistant answer. In the next version, it will become a conversation of multiple rounds.## Dataset Format\nThe dataset is structured in the Vicuna 1.1 format, featuring one-round chats. This format is chosen for its compatibility with various conversational AI training paradigms and its efficiency in representing dialogues.## Volume\nThe dataset comprises a few thousand chat samples. Each sample has been carefully curated to ensure the highest quality, aligning with the LIMA principle.## Licensing\nOur dataset is worry-free regarding proprietary issues, as it is not automatically generated by a proprietary chatbot. This dataset is released under the Apache License 2.0. This license allows for broad freedom in usage and modification, provided that proper credit is given and changes are documented. For full license terms, please refer to the LICENSE file.## Use Case\nThis dataset is ideal for training conversational AI models. It can help in developing chatbots or virtual assistants capable of handling a wide range of queries with high accuracy. To use the dataset for finetuning a model with Axolotl, simply add the following to the .yml file:\ndatasets:\n - path: DrNicefellow/Quality_WorryFree_GeneralQA_Chat_Dataset-v1\n - type: completion## Feeling Generous? \nEager to buy me a cup of 2$ coffe or iced tea? Sure, here is the link: URL Please add a note on which one you want me to drink?"
] |
fbf408c04b15bab9cf0f831d80fe4ff774d88e8a | # Dataset Card for ECInstruct
ECInstruct comprises 10 tasks, including attribute value extraction, product relation prediction,
product matching, sentiment analysis, sequential recommendation, multiclass product classification, product
substitute identification, query product rank, answerability prediction, and answer generation.
ECInstruct is split into training sets, validation sets, in-domain (IND)
test sets, and out-of-domain (OOD) test sets.
## Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** [GitHub](https://github.com/ninglab/eCeLLM)
- **Homepage:** [eCeLLM](https://ninglab.github.io/eCeLLM/)
## Data Split
The statistics of the ECInstruct dataset are shown in the table below.
| Split | Size |
| --- | --- |
| Train | 92,022 |
| Validation | 9,253 |
| Test_IND | 9,253 |
| Test_OOD | 6,000 |
| Total | 116,528 |
## Usage
As detailed in the paper,
for each task, training and evaluation can be conducted under multiple settings.
For example, <code>setting = IND_Diverse_Instruction, task = Answer_Generation</code> indicates
the training set for learning models on the answer generation task with diverse instructions for the IND test set.
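As a rough sketch of that workflow (the `setting` and `task` column names and the `train` split key below are assumptions inferred from the note above; verify them against the actual schema before relying on this):

```python
from datasets import load_dataset

# Hypothetical filter: keep answer-generation training rows for the
# IND setting with diverse instructions (column names are assumed)
ds = load_dataset("NingLab/ECInstruct")
subset = ds["train"].filter(
    lambda x: x["setting"] == "IND_Diverse_Instruction"
    and x["task"] == "Answer_Generation"
)
```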
## License
Please check the license of each subset in our curated dataset ECInstruct.
| Dataset | License Type |
| --- | --- |
| Amazon-Google Products | CC BY 4.0 |
| Amazon Review | Not listed |
| AmazonQA | Not listed |
| Shopping Queries Dataset | Apache License 2.0 |
## Citation
```bibtex
@misc{peng2024ecellm,
title={eCeLLM: Generalizing Large Language Models for E-commerce from Large-scale, High-quality Instruction Data},
author={Bo Peng and Xinyi Ling and Ziru Chen and Huan Sun and Xia Ning},
year={2024},
eprint={2402.08831},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | NingLab/ECInstruct | [
"task_categories:text-classification",
"task_categories:question-answering",
"task_categories:zero-shot-classification",
"task_categories:feature-extraction",
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:en",
"license:cc-by-4.0",
"Large Language Models",
"arxiv:2402.08831",
"region:us"
] | 2024-02-14T03:54:30+00:00 | {"language": ["en"], "license": "cc-by-4.0", "size_categories": ["100K<n<1M"], "task_categories": ["text-classification", "question-answering", "zero-shot-classification", "feature-extraction", "text-generation"], "tags": ["Large Language Models"]} | 2024-02-15T05:18:14+00:00 | [
"2402.08831"
] | [
"en"
] | TAGS
#task_categories-text-classification #task_categories-question-answering #task_categories-zero-shot-classification #task_categories-feature-extraction #task_categories-text-generation #size_categories-100K<n<1M #language-English #license-cc-by-4.0 #Large Language Models #arxiv-2402.08831 #region-us
| Dataset Card for ECInstruct
===========================
ECInstruct comprises 10 tasks, including attribute value extraction, product relation prediction,
product matching, sentiment analysis, sequential recommendation, multiclass product classification, product
substitute identification, query-product ranking, answerability prediction, and answer generation.
ECInstruct is split into training sets, validation sets, in-domain (IND)
test sets, and out-of-domain (OOD) test sets.
Dataset Sources
---------------
* Repository: GitHub
* Homepage: eCeLLM
Data Split
----------
The statistics of the ECInstruct dataset are shown in the table below.
Usage
-----
As detailed in the paper,
for each task, training and evaluation can be conducted under multiple settings.
For example, `setting = IND_Diverse_Instruction, task = Answer_Generation` indicates
the training set for learning models on the answer generation task with diverse instructions for the IND test set.
License
-------
Please check the license of each subset in our curated dataset ECInstruct.
| [] | [
"TAGS\n#task_categories-text-classification #task_categories-question-answering #task_categories-zero-shot-classification #task_categories-feature-extraction #task_categories-text-generation #size_categories-100K<n<1M #language-English #license-cc-by-4.0 #Large Language Models #arxiv-2402.08831 #region-us \n"
] | [
105
] | [
"passage: TAGS\n#task_categories-text-classification #task_categories-question-answering #task_categories-zero-shot-classification #task_categories-feature-extraction #task_categories-text-generation #size_categories-100K<n<1M #language-English #license-cc-by-4.0 #Large Language Models #arxiv-2402.08831 #region-us \n"
] |
736b9e9758e5618575e2ce4d72e609e61688a36e |
### Reference:
- "A Question-Entailment Approach to Question Answering". Asma Ben Abacha and Dina Demner-Fushman. BMC Bioinformatics, 2019. | joseagmz/MedQuad-MedicalQnADataset | [
"task_categories:question-answering",
"task_categories:text2text-generation",
"region:us"
] | 2024-02-14T04:13:31+00:00 | {"task_categories": ["question-answering", "text2text-generation"], "pretty_name": "MedQuad-KV"} | 2024-02-14T04:33:38+00:00 | [] | [] | TAGS
#task_categories-question-answering #task_categories-text2text-generation #region-us
|
### Reference:
- "A Question-Entailment Approach to Question Answering". Asma Ben Abacha and Dina Demner-Fushman. BMC Bioinformatics, 2019. | [
"### Reference:\n- \"A Question-Entailment Approach to Question Answering\". Asma Ben Abacha and Dina Demner-Fushman. BMC Bioinformatics, 2019."
] | [
"TAGS\n#task_categories-question-answering #task_categories-text2text-generation #region-us \n",
"### Reference:\n- \"A Question-Entailment Approach to Question Answering\". Asma Ben Abacha and Dina Demner-Fushman. BMC Bioinformatics, 2019."
] | [
31,
41
] | [
"passage: TAGS\n#task_categories-question-answering #task_categories-text2text-generation #region-us \n### Reference:\n- \"A Question-Entailment Approach to Question Answering\". Asma Ben Abacha and Dina Demner-Fushman. BMC Bioinformatics, 2019."
] |
f0a734ca4d836a385b7005e38519d187c705a935 |
# Dataset Card for Evaluation run of dddsaty/FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [dddsaty/FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach](https://huggingface.co/dddsaty/FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dddsaty__FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach",
"harness_winogrande_5",
split="train")
```
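To see which of the 63 per-task configurations exist before loading one, the `datasets` library's config helper can be used (a small sketch):

```python
from datasets import get_dataset_config_names

# Lists configs such as "harness_winogrande_5" or "results"
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_dddsaty__FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach"
)
print(configs)
```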
## Latest results
These are the [latest results from run 2024-02-14T04:17:47.999729](https://huggingface.co/datasets/open-llm-leaderboard/details_dddsaty__FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach/blob/main/results_2024-02-14T04-17-47.999729.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6570390402660917,
"acc_stderr": 0.0319899764012956,
"acc_norm": 0.6558538110941428,
"acc_norm_stderr": 0.03267458805939838,
"mc1": 0.5826193390452876,
"mc1_stderr": 0.017262891063272164,
"mc2": 0.7124230121816638,
"mc2_stderr": 0.01476530310042609
},
"harness|arc:challenge|25": {
"acc": 0.7158703071672355,
"acc_stderr": 0.013179442447653886,
"acc_norm": 0.7389078498293515,
"acc_norm_stderr": 0.012835523909473836
},
"harness|hellaswag|10": {
"acc": 0.7220673172674766,
"acc_stderr": 0.004470644845242895,
"acc_norm": 0.8893646683927504,
"acc_norm_stderr": 0.0031303894668332005
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455496,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455496
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290913,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066307,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4491620111731844,
"acc_stderr": 0.01663583834163192,
"acc_norm": 0.4491620111731844,
"acc_norm_stderr": 0.01663583834163192
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653344,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653344
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5826193390452876,
"mc1_stderr": 0.017262891063272164,
"mc2": 0.7124230121816638,
"mc2_stderr": 0.01476530310042609
},
"harness|winogrande|5": {
"acc": 0.8760852407261247,
"acc_stderr": 0.009260146295063706
},
"harness|gsm8k|5": {
"acc": 0.6982562547384382,
"acc_stderr": 0.012643544762873358
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_dddsaty__FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach | [
"region:us"
] | 2024-02-14T04:20:04+00:00 | {"pretty_name": "Evaluation run of dddsaty/FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach", "dataset_summary": "Dataset automatically created during the evaluation run of model [dddsaty/FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach](https://huggingface.co/dddsaty/FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dddsaty__FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T04:17:47.999729](https://huggingface.co/datasets/open-llm-leaderboard/details_dddsaty__FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach/blob/main/results_2024-02-14T04-17-47.999729.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6570390402660917,\n \"acc_stderr\": 0.0319899764012956,\n \"acc_norm\": 0.6558538110941428,\n \"acc_norm_stderr\": 0.03267458805939838,\n \"mc1\": 0.5826193390452876,\n \"mc1_stderr\": 0.017262891063272164,\n \"mc2\": 0.7124230121816638,\n \"mc2_stderr\": 0.01476530310042609\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7158703071672355,\n \"acc_stderr\": 0.013179442447653886,\n \"acc_norm\": 0.7389078498293515,\n \"acc_norm_stderr\": 0.012835523909473836\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7220673172674766,\n \"acc_stderr\": 0.004470644845242895,\n \"acc_norm\": 0.8893646683927504,\n \"acc_norm_stderr\": 0.0031303894668332005\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 
0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.02289168798455496,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.02289168798455496\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 
0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066307\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4491620111731844,\n \"acc_stderr\": 0.01663583834163192,\n \"acc_norm\": 0.4491620111731844,\n \"acc_norm_stderr\": 0.01663583834163192\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653344,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653344\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5826193390452876,\n \"mc1_stderr\": 0.017262891063272164,\n \"mc2\": 0.7124230121816638,\n \"mc2_stderr\": 0.01476530310042609\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8760852407261247,\n \"acc_stderr\": 0.009260146295063706\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6982562547384382,\n \"acc_stderr\": 
0.012643544762873358\n }\n}\n```", "repo_url": "https://huggingface.co/dddsaty/FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|arc:challenge|25_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|gsm8k|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hellaswag|10_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T04-17-47.999729.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T04-17-47.999729.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T04-17-47.999729.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T04-17-47.999729.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T04-17-47.999729.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T04_17_47.999729", "path": ["**/details_harness|winogrande|5_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T04-17-47.999729.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_14T04_17_47.999729", "path": ["results_2024-02-14T04-17-47.999729.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T04-17-47.999729.parquet"]}]}]} | 2024-02-14T04:20:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of dddsaty/FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach
Dataset automatically created during the evaluation run of model dddsaty/FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
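A sketch of that call, using the repository name recorded in this card's metadata (the Winogrande configuration is just an example; any of the 63 configurations loads the same way):

```python
from datasets import load_dataset

# The "train" split always points to the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_dddsaty__FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach",
    "harness_winogrande_5",
    split="train",
)
```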
## Latest results
These are the latest results from run 2024-02-14T04:17:47.999729 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of dddsaty/FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach\n\n\n\nDataset automatically created during the evaluation run of model dddsaty/FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T04:17:47.999729(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of dddsaty/FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach\n\n\n\nDataset automatically created during the evaluation run of model dddsaty/FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T04:17:47.999729(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
217,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dddsaty/FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach\n\n\n\nDataset automatically created during the evaluation run of model dddsaty/FusionNet_7Bx2_MoE_Ko_DPO_Adapter_Attach on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T04:17:47.999729(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:"
] |
dcd14d5fe88e6f41f88a7996ad34f0599cb64371 |
The entire Code Golf Stack Exchange where questions have a score above 0: 14K code questions with all their answers
- good for learning complex code questions, more unique challenges, code optimizations, and code that isn't really mainstream; could help diversity | VatsaDev/codegolf | [
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:en",
"license:mit",
"code",
"challege",
"codegolf",
"region:us"
] | 2024-02-14T04:28:22+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation"], "pretty_name": "Codegolf", "tags": ["code", "challege", "codegolf"]} | 2024-02-17T04:50:02+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-10K<n<100K #language-English #license-mit #code #challege #codegolf #region-us
|
The entire Code Golf Stack Exchange where questions have a score above 0: 14K code questions with all their answers
- good for learning complex code questions, more unique challenges, code optimizations, and code that isn't really mainstream; could help diversity | [] | [
"TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #language-English #license-mit #code #challege #codegolf #region-us \n"
] | [
46
] | [
"passage: TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #language-English #license-mit #code #challege #codegolf #region-us \n"
] |
d4004f96e9e84e4163fd47beea09c2c17cdd7c81 |
# Dataset Card for Evaluation run of Test157t/Prima-Pastacles-7b-128k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Test157t/Prima-Pastacles-7b-128k](https://huggingface.co/Test157t/Prima-Pastacles-7b-128k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Test157t__Prima-Pastacles-7b-128k",
"harness_winogrande_5",
split="train")
```
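
The aggregated metrics live in the "results" configuration of the same repository. As a minimal sketch (the configuration name "results" and the "latest" split are taken from this card's metadata; everything else follows the usual `datasets` API):

```python
from datasets import load_dataset

# The "latest" split of the "results" configuration points to the most
# recent aggregated run for this model.
results = load_dataset("open-llm-leaderboard/details_Test157t__Prima-Pastacles-7b-128k",
	"results",
	split="latest")
```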
## Latest results
These are the [latest results from run 2024-02-14T04:48:26.646856](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Prima-Pastacles-7b-128k/blob/main/results_2024-02-14T04-48-26.646856.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.648455875293252,
"acc_stderr": 0.03217096547257116,
"acc_norm": 0.6501400436143484,
"acc_norm_stderr": 0.0328192833219117,
"mc1": 0.44430844553243576,
"mc1_stderr": 0.017394586250743173,
"mc2": 0.6250924624380715,
"mc2_stderr": 0.015327084138857437
},
"harness|arc:challenge|25": {
"acc": 0.6459044368600683,
"acc_stderr": 0.013975454122756558,
"acc_norm": 0.6808873720136519,
"acc_norm_stderr": 0.013621696119173311
},
"harness|hellaswag|10": {
"acc": 0.69398526190002,
"acc_stderr": 0.004598940722374087,
"acc_norm": 0.86566421031667,
"acc_norm_stderr": 0.003403158010309544
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137605,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137605
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268552,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268552
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.03031371053819889,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.03031371053819889
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941197,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941197
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465066,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465066
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501562,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501562
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993457,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993457
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577615,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577615
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3854748603351955,
"acc_stderr": 0.016277927039638193,
"acc_norm": 0.3854748603351955,
"acc_norm_stderr": 0.016277927039638193
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667878,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015055,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015055
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093085,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093085
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44430844553243576,
"mc1_stderr": 0.017394586250743173,
"mc2": 0.6250924624380715,
"mc2_stderr": 0.015327084138857437
},
"harness|winogrande|5": {
"acc": 0.8105761641673244,
"acc_stderr": 0.011012790432989247
},
"harness|gsm8k|5": {
"acc": 0.5936315390447309,
"acc_stderr": 0.013528846685413239
}
}
```
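
As an illustrative sketch only (not part of the standard card template), per-task metrics in a payload shaped like the JSON above can be tabulated with a few lines of Python. The file name below is the results file referenced in this card; reading it from the local working directory is an assumption for illustration:

```python
import json

# Load a results payload shaped like the dict above: top-level keys are
# task names, each mapping to its metric dict.
with open("results_2024-02-14T04-48-26.646856.json") as f:
    results = json.load(f)

# Print normalized accuracy for every task that reports one.
for task, metrics in sorted(results.items()):
    if "acc_norm" in metrics:
        print(f"{task}: acc_norm = {metrics['acc_norm']:.4f}")
```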
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Test157t__Prima-Pastacles-7b-128k | [
"region:us"
] | 2024-02-14T04:50:42+00:00 | {"pretty_name": "Evaluation run of Test157t/Prima-Pastacles-7b-128k", "dataset_summary": "Dataset automatically created during the evaluation run of model [Test157t/Prima-Pastacles-7b-128k](https://huggingface.co/Test157t/Prima-Pastacles-7b-128k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Test157t__Prima-Pastacles-7b-128k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T04:48:26.646856](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Prima-Pastacles-7b-128k/blob/main/results_2024-02-14T04-48-26.646856.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.648455875293252,\n \"acc_stderr\": 0.03217096547257116,\n \"acc_norm\": 0.6501400436143484,\n \"acc_norm_stderr\": 0.0328192833219117,\n \"mc1\": 0.44430844553243576,\n \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.6250924624380715,\n \"mc2_stderr\": 0.015327084138857437\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6459044368600683,\n \"acc_stderr\": 0.013975454122756558,\n \"acc_norm\": 0.6808873720136519,\n \"acc_norm_stderr\": 0.013621696119173311\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.69398526190002,\n \"acc_stderr\": 0.004598940722374087,\n \"acc_norm\": 0.86566421031667,\n \"acc_norm_stderr\": 0.003403158010309544\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137605,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137605\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268552,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268552\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.03031371053819889,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.03031371053819889\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6538461538461539,\n \"acc_stderr\": 0.024121125416941197,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465066,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465066\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501562,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501562\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993457,\n \"acc_norm\": 0.8237547892720306,\n 
\"acc_norm_stderr\": 0.013625556907993457\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577615,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577615\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n \"acc_stderr\": 0.016277927039638193,\n \"acc_norm\": 0.3854748603351955,\n \"acc_norm_stderr\": 0.016277927039638193\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667878,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667878\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n \"acc_stderr\": 0.012751075788015055,\n \"acc_norm\": 0.4726205997392438,\n \"acc_norm_stderr\": 0.012751075788015055\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093085,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093085\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44430844553243576,\n \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.6250924624380715,\n \"mc2_stderr\": 0.015327084138857437\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8105761641673244,\n \"acc_stderr\": 0.011012790432989247\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5936315390447309,\n \"acc_stderr\": 0.013528846685413239\n }\n}\n```", "repo_url": "https://huggingface.co/Test157t/Prima-Pastacles-7b-128k", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|arc:challenge|25_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|gsm8k|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hellaswag|10_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T04-48-26.646856.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T04-48-26.646856.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T04-48-26.646856.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T04-48-26.646856.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T04-48-26.646856.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T04-48-26.646856.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["**/details_harness|winogrande|5_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T04-48-26.646856.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T04_48_26.646856", "path": ["results_2024-02-14T04-48-26.646856.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T04-48-26.646856.parquet"]}]}]} | 2024-02-14T04:51:04+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Test157t/Prima-Pastacles-7b-128k
Dataset automatically created during the evaluation run of model Test157t/Prima-Pastacles-7b-128k on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-14T04:48:26.646856 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Test157t/Prima-Pastacles-7b-128k\n\n\n\nDataset automatically created during the evaluation run of model Test157t/Prima-Pastacles-7b-128k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T04:48:26.646856(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Test157t/Prima-Pastacles-7b-128k\n\n\n\nDataset automatically created during the evaluation run of model Test157t/Prima-Pastacles-7b-128k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T04:48:26.646856(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Test157t/Prima-Pastacles-7b-128k\n\n\n\nDataset automatically created during the evaluation run of model Test157t/Prima-Pastacles-7b-128k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T04:48:26.646856(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
649191ecb27f4f6a2586724190256789e161949a |
# Dataset Card for Evaluation run of Lvxy1117/amber_fine_tune_sgall
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Lvxy1117/amber_fine_tune_sgall](https://huggingface.co/Lvxy1117/amber_fine_tune_sgall) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_sgall",
"harness_winogrande_5",
split="train")
```
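
Since each of the 63 tasks has its own configuration and every run adds a timestamped split, it can help to enumerate what is available before loading, and the aggregated metrics can be pulled from the "results" configuration directly. Below is a minimal sketch using standard `datasets` helpers; only the repository name comes from this card, the rest is generic API usage:

```python
from datasets import get_dataset_config_names, get_dataset_split_names, load_dataset

repo = "open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_sgall"

# Enumerate the per-task configurations (plus the aggregated "results" config).
configs = get_dataset_config_names(repo)
print(len(configs), configs[:3])

# Each configuration exposes one timestamped split per run and a "latest" alias.
print(get_dataset_split_names(repo, "harness_winogrande_5"))

# The aggregated metrics shown below live in the "results" configuration.
results = load_dataset(repo, "results", split="latest")
```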
## Latest results
These are the [latest results from run 2024-02-14T04:49:00.115070](https://huggingface.co/datasets/open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_sgall/blob/main/results_2024-02-14T04-49-00.115070.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.32013909097742504,
"acc_stderr": 0.032715298552530025,
"acc_norm": 0.3224755631334577,
"acc_norm_stderr": 0.03349829796565176,
"mc1": 0.2558139534883721,
"mc1_stderr": 0.01527417621928336,
"mc2": 0.4047870475782831,
"mc2_stderr": 0.014878403265738149
},
"harness|arc:challenge|25": {
"acc": 0.40955631399317405,
"acc_stderr": 0.014370358632472446,
"acc_norm": 0.44283276450511944,
"acc_norm_stderr": 0.014515573873348902
},
"harness|hellaswag|10": {
"acc": 0.5653256323441546,
"acc_stderr": 0.004947010937455345,
"acc_norm": 0.7476598287193786,
"acc_norm_stderr": 0.004334676952703862
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3018867924528302,
"acc_stderr": 0.028254200344438648,
"acc_norm": 0.3018867924528302,
"acc_norm_stderr": 0.028254200344438648
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.31213872832369943,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.31213872832369943,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617748,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617748
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835362,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835362
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3310344827586207,
"acc_stderr": 0.039215453124671215,
"acc_norm": 0.3310344827586207,
"acc_norm_stderr": 0.039215453124671215
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.02218203720294836,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.02218203720294836
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2967741935483871,
"acc_stderr": 0.025988500792411894,
"acc_norm": 0.2967741935483871,
"acc_norm_stderr": 0.025988500792411894
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21674876847290642,
"acc_stderr": 0.028990331252516235,
"acc_norm": 0.21674876847290642,
"acc_norm_stderr": 0.028990331252516235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3393939393939394,
"acc_stderr": 0.03697442205031596,
"acc_norm": 0.3393939393939394,
"acc_norm_stderr": 0.03697442205031596
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.03318477333845331,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.03318477333845331
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37823834196891193,
"acc_stderr": 0.03499807276193337,
"acc_norm": 0.37823834196891193,
"acc_norm_stderr": 0.03499807276193337
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26153846153846155,
"acc_stderr": 0.022282141204204423,
"acc_norm": 0.26153846153846155,
"acc_norm_stderr": 0.022282141204204423
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21851851851851853,
"acc_stderr": 0.02519575225182379,
"acc_norm": 0.21851851851851853,
"acc_norm_stderr": 0.02519575225182379
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31512605042016806,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.31512605042016806,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.19205298013245034,
"acc_stderr": 0.03216298420593614,
"acc_norm": 0.19205298013245034,
"acc_norm_stderr": 0.03216298420593614
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3174311926605505,
"acc_stderr": 0.0199571521984605,
"acc_norm": 0.3174311926605505,
"acc_norm_stderr": 0.0199571521984605
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.17592592592592593,
"acc_stderr": 0.025967420958258533,
"acc_norm": 0.17592592592592593,
"acc_norm_stderr": 0.025967420958258533
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3284313725490196,
"acc_stderr": 0.03296245110172229,
"acc_norm": 0.3284313725490196,
"acc_norm_stderr": 0.03296245110172229
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.35443037974683544,
"acc_stderr": 0.031137304297185805,
"acc_norm": 0.35443037974683544,
"acc_norm_stderr": 0.031137304297185805
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.43946188340807174,
"acc_stderr": 0.03331092511038179,
"acc_norm": 0.43946188340807174,
"acc_norm_stderr": 0.03331092511038179
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.33587786259541985,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.33587786259541985,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.34710743801652894,
"acc_stderr": 0.04345724570292535,
"acc_norm": 0.34710743801652894,
"acc_norm_stderr": 0.04345724570292535
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.04582124160161549,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.04582124160161549
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.42735042735042733,
"acc_stderr": 0.03240847393516326,
"acc_norm": 0.42735042735042733,
"acc_norm_stderr": 0.03240847393516326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.01757070523925654,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.01757070523925654
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.33236994219653176,
"acc_stderr": 0.025361168749688228,
"acc_norm": 0.33236994219653176,
"acc_norm_stderr": 0.025361168749688228
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2335195530726257,
"acc_stderr": 0.01414957534897627,
"acc_norm": 0.2335195530726257,
"acc_norm_stderr": 0.01414957534897627
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3300653594771242,
"acc_stderr": 0.026925654653615686,
"acc_norm": 0.3300653594771242,
"acc_norm_stderr": 0.026925654653615686
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4180064308681672,
"acc_stderr": 0.028013651891995072,
"acc_norm": 0.4180064308681672,
"acc_norm_stderr": 0.028013651891995072
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.38271604938271603,
"acc_stderr": 0.027044538138402605,
"acc_norm": 0.38271604938271603,
"acc_norm_stderr": 0.027044538138402605
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.02689170942834396,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.02689170942834396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2653194263363755,
"acc_stderr": 0.011276198843958878,
"acc_norm": 0.2653194263363755,
"acc_norm_stderr": 0.011276198843958878
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.0276784686421447,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.0276784686421447
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.018926082916083393,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.018926082916083393
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22448979591836735,
"acc_stderr": 0.026711430555538422,
"acc_norm": 0.22448979591836735,
"acc_norm_stderr": 0.026711430555538422
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.35323383084577115,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.35323383084577115,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322415,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322415
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4502923976608187,
"acc_stderr": 0.03815827365913235,
"acc_norm": 0.4502923976608187,
"acc_norm_stderr": 0.03815827365913235
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2558139534883721,
"mc1_stderr": 0.01527417621928336,
"mc2": 0.4047870475782831,
"mc2_stderr": 0.014878403265738149
},
"harness|winogrande|5": {
"acc": 0.6748224151539068,
"acc_stderr": 0.013165525471764361
},
"harness|gsm8k|5": {
"acc": 0.043214556482183475,
"acc_stderr": 0.005600987515237868
}
}
```
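
Every accuracy above comes paired with a standard error, so an approximate confidence interval can be computed straight from this JSON. A small illustrative helper follows; the 95% normal-approximation interval is our assumption here, not something the harness itself reports:

```python
# Approximate 95% confidence interval from an accuracy and its standard
# error, using the usual normal approximation (acc +/- 1.96 * stderr).
def ci95(acc: float, stderr: float) -> tuple[float, float]:
    return acc - 1.96 * stderr, acc + 1.96 * stderr

# Example with the Winogrande numbers reported above.
low, high = ci95(0.6748224151539068, 0.013165525471764361)
print(f"winogrande acc 95% CI: [{low:.3f}, {high:.3f}]")
```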
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_sgall | [
"region:us"
] | 2024-02-14T04:50:46+00:00 | {"pretty_name": "Evaluation run of Lvxy1117/amber_fine_tune_sgall", "dataset_summary": "Dataset automatically created during the evaluation run of model [Lvxy1117/amber_fine_tune_sgall](https://huggingface.co/Lvxy1117/amber_fine_tune_sgall) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_sgall\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T04:49:00.115070](https://huggingface.co/datasets/open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_sgall/blob/main/results_2024-02-14T04-49-00.115070.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.32013909097742504,\n \"acc_stderr\": 0.032715298552530025,\n \"acc_norm\": 0.3224755631334577,\n \"acc_norm_stderr\": 0.03349829796565176,\n \"mc1\": 0.2558139534883721,\n \"mc1_stderr\": 0.01527417621928336,\n \"mc2\": 0.4047870475782831,\n \"mc2_stderr\": 0.014878403265738149\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.40955631399317405,\n \"acc_stderr\": 0.014370358632472446,\n \"acc_norm\": 0.44283276450511944,\n \"acc_norm_stderr\": 0.014515573873348902\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5653256323441546,\n \"acc_stderr\": 0.004947010937455345,\n \"acc_norm\": 0.7476598287193786,\n \"acc_norm_stderr\": 0.004334676952703862\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.03715062154998905,\n \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.03715062154998905\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.028254200344438648,\n \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.028254200344438648\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.19,\n 
\"acc_stderr\": 0.03942772444036622,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.31213872832369943,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.31213872832369943,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617748,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617748\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835362,\n \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835362\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3310344827586207,\n \"acc_stderr\": 0.039215453124671215,\n \"acc_norm\": 0.3310344827586207,\n \"acc_norm_stderr\": 0.039215453124671215\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.02218203720294836,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.02218203720294836\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2967741935483871,\n \"acc_stderr\": 0.025988500792411894,\n \"acc_norm\": 0.2967741935483871,\n \"acc_norm_stderr\": 0.025988500792411894\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.21674876847290642,\n \"acc_stderr\": 0.028990331252516235,\n \"acc_norm\": 0.21674876847290642,\n \"acc_norm_stderr\": 0.028990331252516235\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.3393939393939394,\n \"acc_stderr\": 0.03697442205031596,\n \"acc_norm\": 0.3393939393939394,\n \"acc_norm_stderr\": 0.03697442205031596\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3181818181818182,\n \"acc_stderr\": 0.03318477333845331,\n \"acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.03318477333845331\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.37823834196891193,\n \"acc_stderr\": 0.03499807276193337,\n \"acc_norm\": 0.37823834196891193,\n \"acc_norm_stderr\": 0.03499807276193337\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.26153846153846155,\n \"acc_stderr\": 0.022282141204204423,\n \"acc_norm\": 0.26153846153846155,\n \"acc_norm_stderr\": 0.022282141204204423\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.21851851851851853,\n \"acc_stderr\": 0.02519575225182379,\n \"acc_norm\": 0.21851851851851853,\n \"acc_norm_stderr\": 0.02519575225182379\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.31512605042016806,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.31512605042016806,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.19205298013245034,\n \"acc_stderr\": 0.03216298420593614,\n \"acc_norm\": 0.19205298013245034,\n \"acc_norm_stderr\": 0.03216298420593614\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3174311926605505,\n \"acc_stderr\": 0.0199571521984605,\n \"acc_norm\": 0.3174311926605505,\n \"acc_norm_stderr\": 0.0199571521984605\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.17592592592592593,\n \"acc_stderr\": 0.025967420958258533,\n \"acc_norm\": 0.17592592592592593,\n \"acc_norm_stderr\": 0.025967420958258533\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.3284313725490196,\n \"acc_stderr\": 0.03296245110172229,\n \"acc_norm\": 0.3284313725490196,\n \"acc_norm_stderr\": 0.03296245110172229\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.35443037974683544,\n \"acc_stderr\": 0.031137304297185805,\n \"acc_norm\": 0.35443037974683544,\n \"acc_norm_stderr\": 0.031137304297185805\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.43946188340807174,\n \"acc_stderr\": 0.03331092511038179,\n \"acc_norm\": 0.43946188340807174,\n \"acc_norm_stderr\": 0.03331092511038179\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.33587786259541985,\n \"acc_stderr\": 0.041423137719966634,\n \"acc_norm\": 0.33587786259541985,\n \"acc_norm_stderr\": 0.041423137719966634\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.34710743801652894,\n \"acc_stderr\": 0.04345724570292535,\n \"acc_norm\": 0.34710743801652894,\n \"acc_norm_stderr\": 0.04345724570292535\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.04582124160161549,\n \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.04582124160161549\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.42735042735042733,\n \"acc_stderr\": 0.03240847393516326,\n \"acc_norm\": 0.42735042735042733,\n \"acc_norm_stderr\": 0.03240847393516326\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.4074074074074074,\n \"acc_stderr\": 0.01757070523925654,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.01757070523925654\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.33236994219653176,\n \"acc_stderr\": 0.025361168749688228,\n \"acc_norm\": 0.33236994219653176,\n \"acc_norm_stderr\": 0.025361168749688228\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2335195530726257,\n \"acc_stderr\": 0.01414957534897627,\n \"acc_norm\": 0.2335195530726257,\n \"acc_norm_stderr\": 0.01414957534897627\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3300653594771242,\n \"acc_stderr\": 0.026925654653615686,\n \"acc_norm\": 0.3300653594771242,\n \"acc_norm_stderr\": 0.026925654653615686\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4180064308681672,\n \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.4180064308681672,\n \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.38271604938271603,\n \"acc_stderr\": 0.027044538138402605,\n \"acc_norm\": 0.38271604938271603,\n \"acc_norm_stderr\": 0.027044538138402605\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.28368794326241137,\n \"acc_stderr\": 0.02689170942834396,\n \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.02689170942834396\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2653194263363755,\n \"acc_stderr\": 0.011276198843958878,\n \"acc_norm\": 0.2653194263363755,\n \"acc_norm_stderr\": 0.011276198843958878\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.0276784686421447,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.0276784686421447\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.018926082916083393,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.018926082916083393\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.22448979591836735,\n \"acc_stderr\": 0.026711430555538422,\n \"acc_norm\": 0.22448979591836735,\n \"acc_norm_stderr\": 0.026711430555538422\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.35323383084577115,\n \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.35323383084577115,\n \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n \"acc_stderr\": 0.03726214354322415,\n \"acc_norm\": 0.35542168674698793,\n \"acc_norm_stderr\": 0.03726214354322415\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.4502923976608187,\n \"acc_stderr\": 0.03815827365913235,\n \"acc_norm\": 0.4502923976608187,\n \"acc_norm_stderr\": 0.03815827365913235\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2558139534883721,\n \"mc1_stderr\": 0.01527417621928336,\n \"mc2\": 0.4047870475782831,\n \"mc2_stderr\": 0.014878403265738149\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6748224151539068,\n \"acc_stderr\": 0.013165525471764361\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.043214556482183475,\n \"acc_stderr\": 
0.005600987515237868\n }\n}\n```", "repo_url": "https://huggingface.co/Lvxy1117/amber_fine_tune_sgall", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|arc:challenge|25_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|gsm8k|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hellaswag|10_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T04-49-00.115070.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T04-49-00.115070.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T04-49-00.115070.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T04-49-00.115070.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T04-49-00.115070.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T04_49_00.115070", "path": ["**/details_harness|winogrande|5_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T04-49-00.115070.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_14T04_49_00.115070", "path": ["results_2024-02-14T04-49-00.115070.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T04-49-00.115070.parquet"]}]}]} | 2024-02-14T04:51:10+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Lvxy1117/amber_fine_tune_sgall
Dataset automatically created during the evaluation run of model Lvxy1117/amber_fine_tune_sgall on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
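Concretely, a minimal sketch (assuming the details repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming convention, since the stripped text above does not spell out the repo id):

```python
from datasets import load_dataset

# Assumed repo id, derived from the "details_<org>__<model>" convention.
data = load_dataset("open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_sgall",
	"harness_winogrande_5",
	split="train")
```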
## Latest results
These are the latest results from run 2024-02-14T04:49:00.115070 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Lvxy1117/amber_fine_tune_sgall\n\n\n\nDataset automatically created during the evaluation run of model Lvxy1117/amber_fine_tune_sgall on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T04:49:00.115070(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Lvxy1117/amber_fine_tune_sgall\n\n\n\nDataset automatically created during the evaluation run of model Lvxy1117/amber_fine_tune_sgall on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T04:49:00.115070(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Lvxy1117/amber_fine_tune_sgall\n\n\n\nDataset automatically created during the evaluation run of model Lvxy1117/amber_fine_tune_sgall on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T04:49:00.115070(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
00d38a454cb5c1498ec68b07697abbcb5398663e |
# Dataset Card for Evaluation run of Test157t/Pasta-PrimaMaid-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Test157t/Pasta-PrimaMaid-7b](https://huggingface.co/Test157t/Pasta-PrimaMaid-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Test157t__Pasta-PrimaMaid-7b",
"harness_winogrande_5",
split="train")
```
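The same call also works for the aggregated run-level metrics; a short sketch, assuming the `results` configuration and `latest` split names listed in this card's metadata:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics for each run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_Test157t__Pasta-PrimaMaid-7b",
	"results",
	split="latest")
```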
## Latest results
These are the [latest results from run 2024-02-14T05:01:29.583377](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Pasta-PrimaMaid-7b/blob/main/results_2024-02-14T05-01-29.583377.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.634263599311872,
"acc_stderr": 0.03256128363764394,
"acc_norm": 0.6378018463193035,
"acc_norm_stderr": 0.03320777359975318,
"mc1": 0.48959608323133413,
"mc1_stderr": 0.017499711430249264,
"mc2": 0.6646812404946975,
"mc2_stderr": 0.0151510838946931
},
"harness|arc:challenge|25": {
"acc": 0.6416382252559727,
"acc_stderr": 0.014012883334859857,
"acc_norm": 0.6791808873720137,
"acc_norm_stderr": 0.013640943091946535
},
"harness|hellaswag|10": {
"acc": 0.6888070105556662,
"acc_stderr": 0.004620353433075613,
"acc_norm": 0.8617805218084047,
"acc_norm_stderr": 0.0034442484997916595
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851105,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851105
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895518,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.024162780284017724,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.024162780284017724
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.029560707392465725,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.029560707392465725
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.029597329730978086,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.029597329730978086
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266857,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266857
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854052,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251745,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251745
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069716,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39664804469273746,
"acc_stderr": 0.01636135476982247,
"acc_norm": 0.39664804469273746,
"acc_norm_stderr": 0.01636135476982247
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.012732398286190442,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.012732398286190442
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.028888193103988633,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.028888193103988633
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.01931267606578655,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.01931267606578655
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578327,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48959608323133413,
"mc1_stderr": 0.017499711430249264,
"mc2": 0.6646812404946975,
"mc2_stderr": 0.0151510838946931
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643416
},
"harness|gsm8k|5": {
"acc": 0.49128127369219105,
"acc_stderr": 0.013770390697002107
}
}
```
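As a rough sanity check, the leaderboard's headline average is conventionally the mean of six benchmark scores (ARC acc_norm, HellaSwag acc_norm, MMLU acc, TruthfulQA mc2, Winogrande acc, GSM8K acc). A minimal sketch reproducing it from the numbers above, using the "all" aggregate as a stand-in for the MMLU average (an approximation, since that aggregate also folds in the ARC and HellaSwag logs):

```python
# Headline scores copied from the results block above (all in [0, 1]).
scores = {
    "ARC (25-shot, acc_norm)":       0.6791808873720137,
    "HellaSwag (10-shot, acc_norm)": 0.8617805218084047,
    "MMLU proxy (5-shot, acc_norm)": 0.6378018463193035,  # "all" aggregate, see note above
    "TruthfulQA (0-shot, mc2)":      0.6646812404946975,
    "Winogrande (5-shot, acc)":      0.7790055248618785,
    "GSM8K (5-shot, acc)":           0.49128127369219105,
}

average = sum(scores.values()) / len(scores)
print(f"Leaderboard-style average: {average:.4f}")  # ~0.6856
```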
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Test157t__Pasta-PrimaMaid-7b | [
"region:us"
] | 2024-02-14T05:03:47+00:00 | {"pretty_name": "Evaluation run of Test157t/Pasta-PrimaMaid-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Test157t/Pasta-PrimaMaid-7b](https://huggingface.co/Test157t/Pasta-PrimaMaid-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Test157t__Pasta-PrimaMaid-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T05:01:29.583377](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Pasta-PrimaMaid-7b/blob/main/results_2024-02-14T05-01-29.583377.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.634263599311872,\n \"acc_stderr\": 0.03256128363764394,\n \"acc_norm\": 0.6378018463193035,\n \"acc_norm_stderr\": 0.03320777359975318,\n \"mc1\": 0.48959608323133413,\n \"mc1_stderr\": 0.017499711430249264,\n \"mc2\": 0.6646812404946975,\n \"mc2_stderr\": 0.0151510838946931\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6416382252559727,\n \"acc_stderr\": 0.014012883334859857,\n \"acc_norm\": 0.6791808873720137,\n \"acc_norm_stderr\": 0.013640943091946535\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6888070105556662,\n \"acc_stderr\": 0.004620353433075613,\n \"acc_norm\": 0.8617805218084047,\n \"acc_norm_stderr\": 0.0034442484997916595\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851105,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851105\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895518,\n \"acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.024162780284017724,\n \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.024162780284017724\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465725,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465725\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.029597329730978086,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.029597329730978086\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.818348623853211,\n \"acc_stderr\": 0.016530617409266857,\n \"acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266857\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251745,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251745\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n 
\"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069716,\n \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069716\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39664804469273746,\n \"acc_stderr\": 0.01636135476982247,\n \"acc_norm\": 0.39664804469273746,\n \"acc_norm_stderr\": 0.01636135476982247\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02609016250427905,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02609016250427905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.012732398286190442,\n \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.012732398286190442\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988633,\n \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988633\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6486928104575164,\n \"acc_stderr\": 0.01931267606578655,\n \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.01931267606578655\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578327,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48959608323133413,\n \"mc1_stderr\": 0.017499711430249264,\n \"mc2\": 0.6646812404946975,\n \"mc2_stderr\": 0.0151510838946931\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643416\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.49128127369219105,\n \"acc_stderr\": 0.013770390697002107\n }\n}\n```", 
"repo_url": "https://huggingface.co/Test157t/Pasta-PrimaMaid-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|arc:challenge|25_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|gsm8k|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hellaswag|10_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-01-29.583377.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-01-29.583377.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-01-29.583377.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T05-01-29.583377.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-01-29.583377.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T05_01_29.583377", "path": ["**/details_harness|winogrande|5_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T05-01-29.583377.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_14T05_01_29.583377", "path": ["results_2024-02-14T05-01-29.583377.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T05-01-29.583377.parquet"]}]}]} | 2024-02-14T05:04:09+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Test157t/Pasta-PrimaMaid-7b
Dataset automatically created during the evaluation run of model Test157t/Pasta-PrimaMaid-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-14T05:01:29.583377 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Test157t/Pasta-PrimaMaid-7b\n\n\n\nDataset automatically created during the evaluation run of model Test157t/Pasta-PrimaMaid-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T05:01:29.583377(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Test157t/Pasta-PrimaMaid-7b\n\n\n\nDataset automatically created during the evaluation run of model Test157t/Pasta-PrimaMaid-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T05:01:29.583377(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Test157t/Pasta-PrimaMaid-7b\n\n\n\nDataset automatically created during the evaluation run of model Test157t/Pasta-PrimaMaid-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T05:01:29.583377(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
718108bd26a2bf5517735f0e1c4feadb99d331ea |
# Dataset Card for Evaluation run of Test157t/Kunocchini-7b-128k-test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Test157t/Kunocchini-7b-128k-test](https://huggingface.co/Test157t/Kunocchini-7b-128k-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Test157t__Kunocchini-7b-128k-test",
"harness_winogrande_5",
split="train")
```
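The same pattern works for the aggregated metrics. As a minimal sketch (assuming the `results` configuration and `latest` split defined in this dataset's metadata, and the same `datasets` API as above):

```python
from datasets import load_dataset

# The "results" configuration holds one row of aggregated metrics per run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Test157t__Kunocchini-7b-128k-test",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the latest run
```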
## Latest results
These are the [latest results from run 2024-02-14T05:06:44.993569](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Kunocchini-7b-128k-test/blob/main/results_2024-02-14T05-06-44.993569.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6152034757840662,
"acc_stderr": 0.0329247622811643,
"acc_norm": 0.6177868811795422,
"acc_norm_stderr": 0.03358361519821421,
"mc1": 0.4320685434516524,
"mc1_stderr": 0.017341202394988257,
"mc2": 0.5935169882880537,
"mc2_stderr": 0.015805393188351592
},
"harness|arc:challenge|25": {
"acc": 0.6279863481228669,
"acc_stderr": 0.01412459788184446,
"acc_norm": 0.6697952218430034,
"acc_norm_stderr": 0.013743085603760424
},
"harness|hellaswag|10": {
"acc": 0.6738697470623382,
"acc_stderr": 0.004678375103797961,
"acc_norm": 0.8562039434375622,
"acc_norm_stderr": 0.0035016571073867102
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5774193548387097,
"acc_stderr": 0.02810096472427264,
"acc_norm": 0.5774193548387097,
"acc_norm_stderr": 0.02810096472427264
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397457,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397457
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6,
"acc_stderr": 0.02483881198803316,
"acc_norm": 0.6,
"acc_norm_stderr": 0.02483881198803316
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.016847676400091095,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.016847676400091095
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145635
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547128,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547128
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.69,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876164,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876164
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247333,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247333
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4491620111731844,
"acc_stderr": 0.01663583834163192,
"acc_norm": 0.4491620111731844,
"acc_norm_stderr": 0.01663583834163192
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.026787453111906508,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.026787453111906508
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937617,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937617
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.02563082497562135,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.02563082497562135
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.02971928127223685,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.02971928127223685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4621903520208605,
"acc_stderr": 0.012733671880342506,
"acc_norm": 0.4621903520208605,
"acc_norm_stderr": 0.012733671880342506
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.625,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.625,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.019393058402355435,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.019393058402355435
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5970149253731343,
"acc_stderr": 0.03468343295111126,
"acc_norm": 0.5970149253731343,
"acc_norm_stderr": 0.03468343295111126
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653693,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653693
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4320685434516524,
"mc1_stderr": 0.017341202394988257,
"mc2": 0.5935169882880537,
"mc2_stderr": 0.015805393188351592
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643416
},
"harness|gsm8k|5": {
"acc": 0.5231235784685367,
"acc_stderr": 0.013757748544245331
}
}
```
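For quick inspection, the per-task block above can also be sliced programmatically. A minimal sketch (assuming a local copy of the results JSON shaped like the snippet above; the filename is illustrative) that ranks the MMLU (hendrycksTest) subtasks by accuracy:

```python
import json

# Illustrative filename; in practice the JSON can be downloaded from the
# results URL linked above.
with open("results_2024-02-14T05-06-44.993569.json") as f:
    metrics = json.load(f)

# Keys look like "harness|hendrycksTest-<task>|5"; keep the task name and acc.
mmlu = {
    key.split("hendrycksTest-")[1].split("|")[0]: value["acc"]
    for key, value in metrics.items()
    if "hendrycksTest-" in key
}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{task}: {acc:.3f}")
```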
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Test157t__Kunocchini-7b-128k-test | [
"region:us"
] | 2024-02-14T05:09:04+00:00 | {"pretty_name": "Evaluation run of Test157t/Kunocchini-7b-128k-test", "dataset_summary": "Dataset automatically created during the evaluation run of model [Test157t/Kunocchini-7b-128k-test](https://huggingface.co/Test157t/Kunocchini-7b-128k-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Test157t__Kunocchini-7b-128k-test\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T05:06:44.993569](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Kunocchini-7b-128k-test/blob/main/results_2024-02-14T05-06-44.993569.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6152034757840662,\n \"acc_stderr\": 0.0329247622811643,\n \"acc_norm\": 0.6177868811795422,\n \"acc_norm_stderr\": 0.03358361519821421,\n \"mc1\": 0.4320685434516524,\n \"mc1_stderr\": 0.017341202394988257,\n \"mc2\": 0.5935169882880537,\n \"mc2_stderr\": 0.015805393188351592\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6279863481228669,\n \"acc_stderr\": 0.01412459788184446,\n \"acc_norm\": 0.6697952218430034,\n \"acc_norm_stderr\": 0.013743085603760424\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6738697470623382,\n \"acc_stderr\": 0.004678375103797961,\n \"acc_norm\": 0.8562039434375622,\n \"acc_norm_stderr\": 0.0035016571073867102\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n 
\"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5774193548387097,\n \"acc_stderr\": 0.02810096472427264,\n \"acc_norm\": 0.5774193548387097,\n \"acc_norm_stderr\": 0.02810096472427264\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397457,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397457\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6,\n \"acc_stderr\": 0.02483881198803316,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.02483881198803316\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8091743119266055,\n \"acc_stderr\": 0.016847676400091095,\n \"acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.016847676400091095\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145635,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145635\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547128,\n \"acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547128\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.013890862162876164,\n \"acc_norm\": 
0.8148148148148148,\n \"acc_norm_stderr\": 0.013890862162876164\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247333,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247333\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4491620111731844,\n \"acc_stderr\": 0.01663583834163192,\n \"acc_norm\": 0.4491620111731844,\n \"acc_norm_stderr\": 0.01663583834163192\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906508,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906508\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n \"acc_stderr\": 0.026664410886937617,\n \"acc_norm\": 0.6720257234726688,\n \"acc_norm_stderr\": 0.026664410886937617\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.02563082497562135,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.02563082497562135\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.02971928127223685,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.02971928127223685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4621903520208605,\n \"acc_stderr\": 0.012733671880342506,\n \"acc_norm\": 0.4621903520208605,\n \"acc_norm_stderr\": 0.012733671880342506\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6421568627450981,\n \"acc_stderr\": 0.019393058402355435,\n \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.019393058402355435\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5970149253731343,\n \"acc_stderr\": 0.03468343295111126,\n \"acc_norm\": 0.5970149253731343,\n \"acc_norm_stderr\": 0.03468343295111126\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4320685434516524,\n \"mc1_stderr\": 0.017341202394988257,\n \"mc2\": 0.5935169882880537,\n \"mc2_stderr\": 0.015805393188351592\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643416\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5231235784685367,\n \"acc_stderr\": 0.013757748544245331\n }\n}\n```", "repo_url": "https://huggingface.co/Test157t/Kunocchini-7b-128k-test", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|arc:challenge|25_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|gsm8k|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hellaswag|10_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-06-44.993569.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-06-44.993569.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-06-44.993569.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T05-06-44.993569.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-06-44.993569.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-06-44.993569.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["**/details_harness|winogrande|5_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T05-06-44.993569.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T05_06_44.993569", "path": ["results_2024-02-14T05-06-44.993569.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T05-06-44.993569.parquet"]}]}]} | 2024-02-14T05:09:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Test157t/Kunocchini-7b-128k-test
Dataset automatically created during the evaluation run of model Test157t/Kunocchini-7b-128k-test on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
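For instance (restoring the loader snippet recorded in this card's metadata; the configuration name and split follow the pattern described above):

```python
from datasets import load_dataset

# One configuration per evaluated task; the "train" split always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_Test157t__Kunocchini-7b-128k-test",
                    "harness_winogrande_5",
                    split="train")
```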
## Latest results
These are the latest results from run 2024-02-14T05:06:44.993569 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
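A minimal sketch for pulling the aggregated numbers instead (assuming the same `datasets` API; the `results` configuration and its `latest` split are declared in this repo's configuration list):

```python
from datasets import load_dataset

# The "latest" split of the "results" config tracks the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_Test157t__Kunocchini-7b-128k-test",
                       "results",
                       split="latest")
```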
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Test157t/Kunocchini-7b-128k-test\n\n\n\nDataset automatically created during the evaluation run of model Test157t/Kunocchini-7b-128k-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T05:06:44.993569(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Test157t/Kunocchini-7b-128k-test\n\n\n\nDataset automatically created during the evaluation run of model Test157t/Kunocchini-7b-128k-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T05:06:44.993569(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Test157t/Kunocchini-7b-128k-test\n\n\n\nDataset automatically created during the evaluation run of model Test157t/Kunocchini-7b-128k-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T05:06:44.993569(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
4bef3a08e00f95027b38648a4f2bb99bc0233102 |
### Reference:
- "A Question-Entailment Approach to Question Answering". Asma Ben Abacha and Dina Demner-Fushman. BMC Bioinformatics, 2019. | joseagmz/MedQnA_version3 | [
"task_categories:question-answering",
"task_categories:text2text-generation",
"region:us"
] | 2024-02-14T05:10:26+00:00 | {"task_categories": ["question-answering", "text2text-generation"], "pretty_name": "MedQuad-KV"} | 2024-02-14T05:12:41+00:00 | [] | [] | TAGS
#task_categories-question-answering #task_categories-text2text-generation #region-us
|
### Reference:
- "A Question-Entailment Approach to Question Answering". Asma Ben Abacha and Dina Demner-Fushman. BMC Bioinformatics, 2019. | [
"### Reference:\n- \"A Question-Entailment Approach to Question Answering\". Asma Ben Abacha and Dina Demner-Fushman. BMC Bioinformatics, 2019."
] | [
"TAGS\n#task_categories-question-answering #task_categories-text2text-generation #region-us \n",
"### Reference:\n- \"A Question-Entailment Approach to Question Answering\". Asma Ben Abacha and Dina Demner-Fushman. BMC Bioinformatics, 2019."
] | [
31,
41
] | [
"passage: TAGS\n#task_categories-question-answering #task_categories-text2text-generation #region-us \n### Reference:\n- \"A Question-Entailment Approach to Question Answering\". Asma Ben Abacha and Dina Demner-Fushman. BMC Bioinformatics, 2019."
] |
d8d93579f690f5ec9fac8dbc1f9b2a6a60533912 |
# Dataset Card for Evaluation run of Test157t/Kunocchini-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Test157t/Kunocchini-7b](https://huggingface.co/Test157t/Kunocchini-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Test157t__Kunocchini-7b",
"harness_winogrande_5",
split="train")
```
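Similarly, the aggregated metrics can be loaded from the `results` configuration (a sketch assuming the same API; the `latest` split always points at the newest run):

```python
from datasets import load_dataset

# Aggregated metrics for this model; one row per evaluation run.
results = load_dataset("open-llm-leaderboard/details_Test157t__Kunocchini-7b",
                       "results",
                       split="latest")
```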
## Latest results
These are the [latest results from run 2024-02-14T05:12:00.698748](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Kunocchini-7b/blob/main/results_2024-02-14T05-12-00.698748.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6398558023445645,
"acc_stderr": 0.03250448541726538,
"acc_norm": 0.6434251517105837,
"acc_norm_stderr": 0.03315134190382154,
"mc1": 0.5140758873929009,
"mc1_stderr": 0.01749656371704278,
"mc2": 0.6862179059406133,
"mc2_stderr": 0.015230575859702986
},
"harness|arc:challenge|25": {
"acc": 0.6484641638225256,
"acc_stderr": 0.013952413699600935,
"acc_norm": 0.6749146757679181,
"acc_norm_stderr": 0.013688147309729124
},
"harness|hellaswag|10": {
"acc": 0.7058354909380602,
"acc_stderr": 0.00454735017928625,
"acc_norm": 0.8684524995020912,
"acc_norm_stderr": 0.003373073863582292
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724053,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724053
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997695,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997695
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941197,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941197
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.029213549414372174,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.029213549414372174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.015990154885073354,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.015990154885073354
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069425,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069425
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709695,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709695
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.01414397027665757,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.01414397027665757
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247333,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247333
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45251396648044695,
"acc_stderr": 0.016646914804438778,
"acc_norm": 0.45251396648044695,
"acc_norm_stderr": 0.016646914804438778
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818774,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818774
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765137,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303956,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303956
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824873,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854125,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854125
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675592,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675592
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5140758873929009,
"mc1_stderr": 0.01749656371704278,
"mc2": 0.6862179059406133,
"mc2_stderr": 0.015230575859702986
},
"harness|winogrande|5": {
"acc": 0.7797947908445146,
"acc_stderr": 0.011646276755089696
},
"harness|gsm8k|5": {
"acc": 0.4783927217589083,
"acc_stderr": 0.013759618667051771
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Test157t__Kunocchini-7b | [
"region:us"
] | 2024-02-14T05:14:17+00:00 | {"pretty_name": "Evaluation run of Test157t/Kunocchini-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Test157t/Kunocchini-7b](https://huggingface.co/Test157t/Kunocchini-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Test157t__Kunocchini-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T05:12:00.698748](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Kunocchini-7b/blob/main/results_2024-02-14T05-12-00.698748.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6398558023445645,\n \"acc_stderr\": 0.03250448541726538,\n \"acc_norm\": 0.6434251517105837,\n \"acc_norm_stderr\": 0.03315134190382154,\n \"mc1\": 0.5140758873929009,\n \"mc1_stderr\": 0.01749656371704278,\n \"mc2\": 0.6862179059406133,\n \"mc2_stderr\": 0.015230575859702986\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6484641638225256,\n \"acc_stderr\": 0.013952413699600935,\n \"acc_norm\": 0.6749146757679181,\n \"acc_norm_stderr\": 0.013688147309729124\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7058354909380602,\n \"acc_stderr\": 0.00454735017928625,\n \"acc_norm\": 0.8684524995020912,\n \"acc_norm_stderr\": 0.003373073863582292\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724053,\n \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724053\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n 
\"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997695,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997695\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959217,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959217\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593556\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6538461538461539,\n 
\"acc_stderr\": 0.024121125416941197,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.029213549414372174,\n \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.029213549414372174\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.015990154885073354,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.015990154885073354\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069425,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069425\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709695,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709695\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n \"acc_stderr\": 0.01414397027665757,\n \"acc_norm\": 0.8058748403575989,\n 
\"acc_norm_stderr\": 0.01414397027665757\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247333,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247333\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45251396648044695,\n \"acc_stderr\": 0.016646914804438778,\n \"acc_norm\": 0.45251396648044695,\n \"acc_norm_stderr\": 0.016646914804438778\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818774,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818774\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n \"acc_stderr\": 0.012738547371303956,\n \"acc_norm\": 0.46479791395045633,\n \"acc_norm_stderr\": 0.012738547371303956\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824873,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824873\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854125,\n \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854125\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675592,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675592\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5140758873929009,\n \"mc1_stderr\": 0.01749656371704278,\n \"mc2\": 0.6862179059406133,\n \"mc2_stderr\": 0.015230575859702986\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089696\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4783927217589083,\n \"acc_stderr\": 0.013759618667051771\n }\n}\n```", "repo_url": "https://huggingface.co/Test157t/Kunocchini-7b", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|arc:challenge|25_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|gsm8k|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hellaswag|10_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-12-00.698748.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-12-00.698748.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-12-00.698748.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T05-12-00.698748.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-12-00.698748.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-12-00.698748.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["**/details_harness|winogrande|5_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T05-12-00.698748.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T05_12_00.698748", "path": ["results_2024-02-14T05-12-00.698748.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T05-12-00.698748.parquet"]}]}]} | 2024-02-14T05:14:41+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Test157t/Kunocchini-7b
Dataset automatically created during the evaluation run of model Test157t/Kunocchini-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
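For example, the aggregated metrics can be loaded directly from that configuration; the config name and the "latest" split below come from this dataset's own config list:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_Test157t__Kunocchini-7b",
	"results",
	split="latest")
```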
To load the details from a run, you can for instance do the following:
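```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Test157t__Kunocchini-7b",
	"harness_winogrande_5",
	split="train")
```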
## Latest results
These are the latest results from run 2024-02-14T05:12:00.698748 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval).
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
] |
a7fe141153d1be62484f422f51f7b5ee88e349be |
# Dataset Card for Evaluation run of Test157t/HerculeanSea-upd-7b-128k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Test157t/HerculeanSea-upd-7b-128k](https://huggingface.co/Test157t/HerculeanSea-upd-7b-128k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Test157t__HerculeanSea-upd-7b-128k",
"harness_winogrande_5",
split="train")
```
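To retrieve only the aggregated metrics instead of the per-task details, you can load the "results" configuration the same way (a minimal sketch; the config and split names are taken from this dataset's file listing, where the "latest" split points at the most recent run):

```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics of each run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_Test157t__HerculeanSea-upd-7b-128k",
	"results",
	split="latest")
```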
## Latest results
These are the [latest results from run 2024-02-14T05:19:01.826771](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__HerculeanSea-upd-7b-128k/blob/main/results_2024-02-14T05-19-01.826771.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6471081791375651,
"acc_stderr": 0.03218704971437464,
"acc_norm": 0.6487492641334992,
"acc_norm_stderr": 0.032838210995230224,
"mc1": 0.38922888616891066,
"mc1_stderr": 0.01706855268069033,
"mc2": 0.5553625933753323,
"mc2_stderr": 0.015274123807763034
},
"harness|arc:challenge|25": {
"acc": 0.6228668941979523,
"acc_stderr": 0.014163366896192589,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.01383056892797433
},
"harness|hellaswag|10": {
"acc": 0.6729735112527385,
"acc_stderr": 0.004681682605347882,
"acc_norm": 0.8588926508663612,
"acc_norm_stderr": 0.0034742076834003467
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736413,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736413
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062157,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062157
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616323,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464073,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464073
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577615,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577615
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39106145251396646,
"acc_stderr": 0.016320763763808383,
"acc_norm": 0.39106145251396646,
"acc_norm_stderr": 0.016320763763808383
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083135,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083135
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38922888616891066,
"mc1_stderr": 0.01706855268069033,
"mc2": 0.5553625933753323,
"mc2_stderr": 0.015274123807763034
},
"harness|winogrande|5": {
"acc": 0.8121546961325967,
"acc_stderr": 0.010977481103435091
},
"harness|gsm8k|5": {
"acc": 0.6095526914329037,
"acc_stderr": 0.013437829864668582
}
}
```
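As an illustration of how these numbers can be consumed programmatically (a minimal sketch, assuming the linked results file has been downloaded locally and that its top level matches the dictionary printed above):

```python
import json

# results_2024-02-14T05-19-01.826771.json is the file linked in the
# "Latest results" section above, downloaded beforehand.
with open("results_2024-02-14T05-19-01.826771.json") as f:
    results = json.load(f)

# Overall normalized accuracy, plus one per-task score.
print(results["all"]["acc_norm"])         # 0.6487492641334992
print(results["harness|gsm8k|5"]["acc"])  # 0.6095526914329037
```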
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Test157t__HerculeanSea-upd-7b-128k | [
"region:us"
] | 2024-02-14T05:21:21+00:00 | {"pretty_name": "Evaluation run of Test157t/HerculeanSea-upd-7b-128k", "dataset_summary": "Dataset automatically created during the evaluation run of model [Test157t/HerculeanSea-upd-7b-128k](https://huggingface.co/Test157t/HerculeanSea-upd-7b-128k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Test157t__HerculeanSea-upd-7b-128k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T05:19:01.826771](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__HerculeanSea-upd-7b-128k/blob/main/results_2024-02-14T05-19-01.826771.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6471081791375651,\n \"acc_stderr\": 0.03218704971437464,\n \"acc_norm\": 0.6487492641334992,\n \"acc_norm_stderr\": 0.032838210995230224,\n \"mc1\": 0.38922888616891066,\n \"mc1_stderr\": 0.01706855268069033,\n \"mc2\": 0.5553625933753323,\n \"mc2_stderr\": 0.015274123807763034\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6228668941979523,\n \"acc_stderr\": 0.014163366896192589,\n \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.01383056892797433\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6729735112527385,\n \"acc_stderr\": 0.004681682605347882,\n \"acc_norm\": 0.8588926508663612,\n \"acc_norm_stderr\": 0.0034742076834003467\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736413,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736413\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8186462324393359,\n \"acc_stderr\": 0.013778693778464073,\n \"acc_norm\": 0.8186462324393359,\n \"acc_norm_stderr\": 0.013778693778464073\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577615,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577615\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39106145251396646,\n \"acc_stderr\": 0.016320763763808383,\n \"acc_norm\": 0.39106145251396646,\n \"acc_norm_stderr\": 0.016320763763808383\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083135,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083135\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38922888616891066,\n \"mc1_stderr\": 0.01706855268069033,\n \"mc2\": 0.5553625933753323,\n \"mc2_stderr\": 0.015274123807763034\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435091\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6095526914329037,\n \"acc_stderr\": 
0.013437829864668582\n }\n}\n```", "repo_url": "https://huggingface.co/Test157t/HerculeanSea-upd-7b-128k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|arc:challenge|25_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|gsm8k|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hellaswag|10_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-19-01.826771.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-19-01.826771.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-19-01.826771.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T05-19-01.826771.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-19-01.826771.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T05_19_01.826771", "path": ["**/details_harness|winogrande|5_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T05-19-01.826771.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_14T05_19_01.826771", "path": ["results_2024-02-14T05-19-01.826771.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T05-19-01.826771.parquet"]}]}]} | 2024-02-14T05:21:44+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Test157t/HerculeanSea-upd-7b-128k
Dataset automatically created during the evaluation run of model Test157t/HerculeanSea-upd-7b-128k on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
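A minimal sketch of that call (assuming this details repo follows the leaderboard's usual `details_<org>__<model>` naming pattern):

```python
from datasets import load_dataset

# Repo name assumed from the leaderboard's details_<org>__<model> convention;
# "harness_winogrande_5" is one of the 63 per-task configurations.
data = load_dataset("open-llm-leaderboard/details_Test157t__HerculeanSea-upd-7b-128k",
	"harness_winogrande_5",
	split="train")
```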
## Latest results
These are the latest results from run 2024-02-14T05:19:01.826771 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Test157t/HerculeanSea-upd-7b-128k\n\n\n\nDataset automatically created during the evaluation run of model Test157t/HerculeanSea-upd-7b-128k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T05:19:01.826771(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Test157t/HerculeanSea-upd-7b-128k\n\n\n\nDataset automatically created during the evaluation run of model Test157t/HerculeanSea-upd-7b-128k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T05:19:01.826771(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Test157t/HerculeanSea-upd-7b-128k\n\n\n\nDataset automatically created during the evaluation run of model Test157t/HerculeanSea-upd-7b-128k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T05:19:01.826771(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
6bada851bb0972ee1951eb59e765a441dd9a67ba |
# Dataset Card for Evaluation run of Test157t/HerculeanSea-7b-128k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Test157t/HerculeanSea-7b-128k](https://huggingface.co/Test157t/HerculeanSea-7b-128k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Test157t__HerculeanSea-7b-128k",
"harness_winogrande_5",
split="train")
```
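If you want to see which of the 63 per-task configurations exist before loading one, the standard `datasets` API can list them (a sketch; `get_dataset_config_names` is a general library helper, not something specific to this card):

```python
from datasets import get_dataset_config_names

# Lists every configuration, e.g. "harness_winogrande_5", "harness_gsm8k_5", "results", ...
configs = get_dataset_config_names("open-llm-leaderboard/details_Test157t__HerculeanSea-7b-128k")
print(configs)
```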
## Latest results
These are the [latest results from run 2024-02-14T05:24:52.371385](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__HerculeanSea-7b-128k/blob/main/results_2024-02-14T05-24-52.371385.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6446866682807924,
"acc_stderr": 0.03218286079263721,
"acc_norm": 0.646797548216542,
"acc_norm_stderr": 0.03282960918625195,
"mc1": 0.39167686658506734,
"mc1_stderr": 0.01708779588176963,
"mc2": 0.557687745131288,
"mc2_stderr": 0.015253545548350879
},
"harness|arc:challenge|25": {
"acc": 0.6228668941979523,
"acc_stderr": 0.01416336689619259,
"acc_norm": 0.6621160409556314,
"acc_norm_stderr": 0.013822047922283512
},
"harness|hellaswag|10": {
"acc": 0.6708822943636725,
"acc_stderr": 0.004689324696186877,
"acc_norm": 0.8579964150567616,
"acc_norm_stderr": 0.0034834044902359923
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736413,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736413
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851112,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851112
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121434,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121434
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062157,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062157
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887048,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887048
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461766,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461766
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709697,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709697
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077816,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077816
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.01385372417092253,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.01385372417092253
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532337,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3888268156424581,
"acc_stderr": 0.016303899530796126,
"acc_norm": 0.3888268156424581,
"acc_norm_stderr": 0.016303899530796126
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.0248480182638752,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.0248480182638752
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4595827900912647,
"acc_stderr": 0.012728446067669975,
"acc_norm": 0.4595827900912647,
"acc_norm_stderr": 0.012728446067669975
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170598,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170598
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0190709855896875,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0190709855896875
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744546,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744546
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39167686658506734,
"mc1_stderr": 0.01708779588176963,
"mc2": 0.557687745131288,
"mc2_stderr": 0.015253545548350879
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.011082538847491906
},
"harness|gsm8k|5": {
"acc": 0.5837755875663382,
"acc_stderr": 0.013577788334652657
}
}
```
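The aggregated numbers above are also stored as rows of the `results` configuration, so you can pull them programmatically instead of parsing this card (a sketch, assuming the `latest` split name shown in the config metadata):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split always points at the newest run.
results = load_dataset("open-llm-leaderboard/details_Test157t__HerculeanSea-7b-128k",
	"results",
	split="latest")
print(results[0])
```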
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Test157t__HerculeanSea-7b-128k | [
"region:us"
] | 2024-02-14T05:27:12+00:00 | {"pretty_name": "Evaluation run of Test157t/HerculeanSea-7b-128k", "dataset_summary": "Dataset automatically created during the evaluation run of model [Test157t/HerculeanSea-7b-128k](https://huggingface.co/Test157t/HerculeanSea-7b-128k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Test157t__HerculeanSea-7b-128k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T05:24:52.371385](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__HerculeanSea-7b-128k/blob/main/results_2024-02-14T05-24-52.371385.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6446866682807924,\n \"acc_stderr\": 0.03218286079263721,\n \"acc_norm\": 0.646797548216542,\n \"acc_norm_stderr\": 0.03282960918625195,\n \"mc1\": 0.39167686658506734,\n \"mc1_stderr\": 0.01708779588176963,\n \"mc2\": 0.557687745131288,\n \"mc2_stderr\": 0.015253545548350879\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6228668941979523,\n \"acc_stderr\": 0.01416336689619259,\n \"acc_norm\": 0.6621160409556314,\n \"acc_norm_stderr\": 0.013822047922283512\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6708822943636725,\n \"acc_stderr\": 0.004689324696186877,\n \"acc_norm\": 0.8579964150567616,\n \"acc_norm_stderr\": 0.0034834044902359923\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736413,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736413\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851112,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851112\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121434,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121434\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887048,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887048\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461766,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461766\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077816,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077816\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8160919540229885,\n \"acc_stderr\": 0.01385372417092253,\n \"acc_norm\": 0.8160919540229885,\n \"acc_norm_stderr\": 0.01385372417092253\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532337,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532337\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3888268156424581,\n \"acc_stderr\": 0.016303899530796126,\n \"acc_norm\": 0.3888268156424581,\n \"acc_norm_stderr\": 0.016303899530796126\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.0248480182638752,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.0248480182638752\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4595827900912647,\n \"acc_stderr\": 0.012728446067669975,\n \"acc_norm\": 0.4595827900912647,\n \"acc_norm_stderr\": 0.012728446067669975\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744546,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744546\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39167686658506734,\n \"mc1_stderr\": 0.01708779588176963,\n \"mc2\": 0.557687745131288,\n \"mc2_stderr\": 0.015253545548350879\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8074191002367798,\n \"acc_stderr\": 0.011082538847491906\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5837755875663382,\n \"acc_stderr\": 0.013577788334652657\n }\n}\n```", 
"repo_url": "https://huggingface.co/Test157t/HerculeanSea-7b-128k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|arc:challenge|25_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|gsm8k|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hellaswag|10_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-24-52.371385.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-24-52.371385.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-24-52.371385.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T05-24-52.371385.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-24-52.371385.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T05_24_52.371385", "path": ["**/details_harness|winogrande|5_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T05-24-52.371385.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_14T05_24_52.371385", "path": ["results_2024-02-14T05-24-52.371385.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T05-24-52.371385.parquet"]}]}]} | 2024-02-14T05:27:36+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Test157t/HerculeanSea-7b-128k
Dataset automatically created during the evaluation run of model Test157t/HerculeanSea-7b-128k on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
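(A minimal sketch is shown below; the repository id follows the leaderboard's `details_<org>__<model>` naming pattern and is assumed here rather than taken from this card.)

```python
from datasets import load_dataset

# Load one of the 63 task configurations; the "train" split
# always points to the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_Test157t__HerculeanSea-7b-128k",
    "harness_winogrande_5",
    split="train",
)
```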
## Latest results
These are the latest results from run 2024-02-14T05:24:52.371385 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval).
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Test157t/HerculeanSea-7b-128k\n\n\n\nDataset automatically created during the evaluation run of model Test157t/HerculeanSea-7b-128k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T05:24:52.371385(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Test157t/HerculeanSea-7b-128k\n\n\n\nDataset automatically created during the evaluation run of model Test157t/HerculeanSea-7b-128k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T05:24:52.371385(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Test157t/HerculeanSea-7b-128k\n\n\n\nDataset automatically created during the evaluation run of model Test157t/HerculeanSea-7b-128k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T05:24:52.371385(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
caf7db6218b56e2e2a0fc5ed340022ce91fcebdc | # Dataset Card for "gaze-following-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tiennv/gaze-following-test | [
"region:us"
] | 2024-02-14T05:27:26+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "split", "dtype": "string"}, {"name": "width", "dtype": "int64"}, {"name": "height", "dtype": "int64"}, {"name": "bboxes", "dtype": "string"}, {"name": "labels", "dtype": "string"}, {"name": "cab", "dtype": "int64"}, {"name": "hum", "dtype": "int64"}, {"name": "light", "dtype": "float64"}, {"name": "cam", "dtype": "int64"}, {"name": "env", "dtype": "int64"}, {"name": "gaze_item", "dtype": "int64"}, {"name": "gazeIdx", "dtype": "int64"}, {"name": "gaze_cx", "dtype": "int64"}, {"name": "gaze_cy", "dtype": "int64"}, {"name": "hx", "dtype": "int64"}, {"name": "hy", "dtype": "int64"}, {"name": "pitch", "dtype": "float64"}, {"name": "yaw", "dtype": "float64"}, {"name": "roll", "dtype": "float64"}, {"name": "seg", "dtype": "string"}, {"name": "segm_gazeIdx", "dtype": "int64"}, {"name": "occluded", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 11133726929.8, "num_examples": 19200}], "download_size": 11101174289, "dataset_size": 11133726929.8}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-14T05:50:28+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "gaze-following-test"
More Information needed | [
"# Dataset Card for \"gaze-following-test\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"gaze-following-test\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"gaze-following-test\"\n\nMore Information needed"
] |
d55a944e4aa3716683fdbf5898ca91e1e717a403 |
## Introduction
We release the annotated data used in [Dissecting Human and LLM Preferences](https://arxiv.org/abs/).
*Original Dataset* - The dataset is based on [lmsys/chatbot_arena_conversations](https://huggingface.co/datasets/lmsys/chatbot_arena_conversations), which contains 33K cleaned conversations with pairwise human preferences collected from 13K unique IP addresses on the [Chatbot Arena](https://lmsys.org/blog/2023-05-03-arena/) from April to June 2023.
*Filtering and Scenario-wise Sampling* - We filter out conversations that are not in English, that are labeled "Tie" or "Both Bad", and that are multi-turn. We first sample 400 samples with unsafe queries according to the OpenAI moderation API tags and the additional toxic tags in the original dataset, then we apply [Auto-J's scenario classifier](https://huggingface.co/GAIR/autoj-scenario-classifier) to determine the scenario of each sample (we merge Auto-J's scenarios into 10 new ones). For the *Knowledge-aware* and *Others* scenarios, we pick 820 samples, and for the other scenarios, we pick 400 samples. The total number is 5,240.
*Collecting Preferences* - Besides the human preference labels from the original dataset, we also collect binary preference labels from 32 LLMs, including 2 proprietary LLMs and 30 open-source ones.
*Annotation on Defined Properties* - We define a set of 29 properties and annotate how each property is satisfied (via a Likert-scale rating or a property-specific annotation) in all responses ($5,240\times 2=10,480$). See our paper for more details on the defined properties.
## Dataset Overview
An example of the json format is as follows:
```json
{
"query": "...",
"scenario_auto-j": "...",
"scenario_group": "...",
"response_1": {
"content": "...",
"model": "...",
"num_words": "..."
},
"response_2": {...},
"gpt-4-turbo_reference": "...",
"clear intent": "Yes/No",
"explicitly express feelings": "Yes/No",
"explicit constraints": [
...
],
"explicit subjective stances": [
...
],
"explicit mistakes or biases": [
...
],
"preference_labels": {
"human": "response_1/response_2",
"gpt-4-turbo": "response_1/response_2",
...
},
"basic_response_1": {
"admit limitations or mistakes": 0/1/2/3,
"authoritative tone": 0/1/2/3,
...
},
"basic_response_2": {...},
"errors_response_1": {
"applicable or not": "applicable/not applicable",
"errors":[
{
"brief description": "...",
"severity": "severe/moderate/minor",
"type": "...",
},
...
]
},
"errors_response_2": {...},
"query-specific_response_1": {
"clarify user intent": ...,
"correcting explicit mistakes or biases": None,
"satisfying explicit constraints": [
...
],
"showing empathetic": [
...
],
"supporting explicit subjective stances": [
...
]
},
"query-specific_response_2": {...}
}
```
The following fields are basic information:
- **query**: The user query.
- **scenario_auto-j**: The scenario classified by Auto-J's classifier.
- **scenario_group**: One of the 10 new scenarios we merged from Auto-J's scenarios, including an *Unsafe Query* scenario.
- **response_1/response_2**: The content of a response:
- **content**: The text content.
  - **model**: The model that generated this response.
- **num_words**: The number of words of this response, determined by NLTK.
- **gpt-4-turbo_reference**: A reference response generated by GPT-4-Turbo.
The following fields are Query-Specific prerequisites. For the last three, the list may be empty if there are no constraints/stances/mistakes.
- **clear intent**: Whether the intent of the user is clearly expressed in the query, "Yes" or "No".
- **explicitly express feelings**: Whether the user clearly expresses his/her feelings or emotions in the query, "Yes" or "No".
- **explicit constraints**: A list containing all the explicit constraints in the query.
- **explicit subjective stances**: A list containing all the subjective stances in the query.
- **explicit mistakes or biases**: A list containing all the mistakes or biases in the query.
The following fields are the main body of the annotation.
- **preference_labels**: The preference label for each judge (human or an LLM) indicating which response is preferred in a pair, "response_1/response_2".
- **basic_response_1/basic_response_2**: The annotated ratings of the 20 basic properties (except *lengthy*) for the response.
- **property_name**: 0/1/2/3
- ...
- **errors_response_1/errors_response_2**: The detected errors of the response.
  - **applicable or not**: Whether GPT-4-Turbo finds that it can reliably detect the errors in the response.
- **errors**: A list containing the detected errors in the response.
- **brief description**: A brief description of the error.
    - **severity**: How much the error affects the overall correctness of the response, "severe/moderate/minor".
- **type**: The type of the error, "factual error/information contradiction to the query/math operation error/code generation error"
- **query-specific_response_1/query-specific_response_2**: The annotation results of the Query-Specific properties.
  - **clarify user intent**: If the user intent is not clear, rate how much the response helps clarify the intent, 0/1/2/3.
  - **showing empathetic**: If the user expresses feelings or emotions, rate how much the response shows empathy, 0/1/2/3.
  - **satisfying explicit constraints**: If there are explicit constraints in the query, rate how much the response satisfies each of them.
- A list of "{description of constraint} | 0/1/2/3"
  - **correcting explicit mistakes or biases**: If there are mistakes or biases in the query, classify how the response corrects each of them
- A list of "{description of mistake} | Pointed out and corrected/Pointed out but not corrected/Corrected without being pointed out/Neither pointed out nor corrected"
  - **supporting explicit subjective stances**: If there are subjective stances in the query, classify how the response supports each of them
- A list of "{description of stance} | Strongly supported/Weakly supported/Neutral/Weakly opposed/Strongly opposed"
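As a concrete illustration, the fields above can be loaded and inspected as follows (a minimal sketch; the repository id `Anonymousxx/preference-dissection` and the field names are taken from this card's metadata, and `gpt-4-1106-preview` is one of the listed judges):

```python
from datasets import load_dataset

# Load the 5,240 annotated response pairs.
ds = load_dataset("Anonymousxx/preference-dissection", split="train")

sample = ds[0]
print(sample["scenario_group"])                          # one of the 10 merged scenarios
print(sample["preference_labels"]["human"])              # "response_1" or "response_2"
print(sample["preference_labels"]["gpt-4-1106-preview"])

# Query-specific annotations are stored as "description | label" strings.
for item in sample["query-specific_response_1"]["satisfying explicit constraints"]:
    constraint, rating = item.rsplit(" | ", 1)
    print(constraint, "->", rating)
```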
## Statistics
👇 Number of samples meeting 5 Query-specific prerequisites.
| Prerequisite | # | Prerequisite | # |
| ------------------------- | ----- | ---------------- | ---- |
| with explicit constraints | 1,418 | unclear intent | 459 |
| show subjective stances | 388 | express feelings | 121 |
| contain mistakes or bias | 401 | | |
👇 Mean Score/Count for each property in collected data. *The average scores of 5 query-specific properties are calculated only on samples where the queries met specific prerequisites.
| Property | Mean Score/Count | Property | Mean Score/Count |
| ------------------------ | ---------------- | ------------------------ | ---------------- |
| **Mean Score** | | | |
| harmless | 2.90 | persuasive | 0.27 |
| grammarly correct | 2.70 | step-by-step | 0.37 |
| friendly | 1.79 | use informal expressions | 0.04 |
| polite | 2.78 | clear | 2.54 |
| interactive | 0.22 | contain rich information | 1.74 |
| authoritative | 1.67 | novel | 0.47 |
| funny | 0.08 | relevant | 2.45 |
| use rhetorical devices | 0.16 | clarify intent* | 1.33 |
| complex word & sentence | 0.89 | show empathetic* | 1.48 |
| use supporting materials | 0.13 | satisfy constraints* | 2.01 |
| well formatted | 1.26 | support stances* | 2.28 |
| admit limits | 0.17 | correct mistakes* | 1.08 |
| **Mean Count** | | | |
| severe errors | 0.59 | minor errors | 0.23 |
| moderate errors | 0.61 | length | 164.52 |
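The averages above can be re-derived from the per-response annotations, for instance (a sketch reusing the `ds` object and field names from the snippet earlier):

```python
# Mean "harmlessness" rating over both responses of every pair.
n_responses = 2 * len(ds)
mean_harmless = sum(
    s["basic_response_1"]["harmlessness"] + s["basic_response_2"]["harmlessness"]
    for s in ds
) / n_responses
print(round(mean_harmless, 2))  # ~2.90, matching the "harmless" row above
```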
👇 Property correlation in the annotated data.
<img src="./property_corr.PNG" alt="image-20240213145030747" style="zoom: 50%;" />
## Disclaimers and Terms
*This part is copied from the original dataset.*
- **This dataset contains conversations that may be considered unsafe, offensive, or upsetting.** It is not intended for training dialogue agents without applying appropriate filtering measures. We are not responsible for any outputs of the models trained on this dataset.
- Statements or opinions made in this dataset do not reflect the views of researchers or institutions involved in the data collection effort.
- Users of this data are responsible for ensuring its appropriate use, which includes abiding by any applicable laws and regulations.
- Users of this data should adhere to the terms of use for a specific model when using its direct outputs.
- Users of this data agree to not attempt to determine the identity of individuals in this dataset.
## License
Following the original dataset, this dataset is licensed under CC-BY-NC-4.0.
## Citation
```
``` | Anonymousxx/preference-dissection | [
"language:en",
"license:cc-by-nc-4.0",
"region:us"
] | 2024-02-14T05:41:35+00:00 | {"language": ["en"], "license": "cc-by-nc-4.0", "pretty_name": "Preference Dissection", "dataset_info": {"features": [{"name": "query", "dtype": "string"}, {"name": "scenario_auto-j", "dtype": "string"}, {"name": "scenario_group", "dtype": "string"}, {"name": "response_1", "struct": [{"name": "content", "dtype": "string"}, {"name": "model", "dtype": "string"}, {"name": "num_words", "dtype": "int64"}]}, {"name": "response_2", "struct": [{"name": "content", "dtype": "string"}, {"name": "model", "dtype": "string"}, {"name": "num_words", "dtype": "int64"}]}, {"name": "gpt-4-turbo_reference", "dtype": "string"}, {"name": "clear intent", "dtype": "string"}, {"name": "explicitly express feelings", "dtype": "string"}, {"name": "explicit constraints", "sequence": "string"}, {"name": "explicit subjective stances", "sequence": "string"}, {"name": "explicit mistakes or biases", "sequence": "string"}, {"name": "preference_labels", "struct": [{"name": "gpt-3.5-turbo-1106", "dtype": "string"}, {"name": "gpt-4-1106-preview", "dtype": "string"}, {"name": "human", "dtype": "string"}, {"name": "llama-2-13b", "dtype": "string"}, {"name": "llama-2-13b-chat", "dtype": "string"}, {"name": "llama-2-70b", "dtype": "string"}, {"name": "llama-2-70b-chat", "dtype": "string"}, {"name": "llama-2-7b", "dtype": "string"}, {"name": "llama-2-7b-chat", "dtype": "string"}, {"name": "mistral-7b", "dtype": "string"}, {"name": "mistral-7b-instruct-v0.1", "dtype": "string"}, {"name": "mistral-7b-instruct-v0.2", "dtype": "string"}, {"name": "mistral-8x7b", "dtype": "string"}, {"name": "mistral-8x7b-instruct-v0.1", "dtype": "string"}, {"name": "qwen-14b", "dtype": "string"}, {"name": "qwen-14b-chat", "dtype": "string"}, {"name": "qwen-72b", "dtype": "string"}, {"name": "qwen-72b-chat", "dtype": "string"}, {"name": "qwen-7b", "dtype": "string"}, {"name": "qwen-7b-chat", "dtype": "string"}, {"name": "tulu-2-dpo-13b", "dtype": "string"}, {"name": "tulu-2-dpo-70b", "dtype": "string"}, {"name": "tulu-2-dpo-7b", "dtype": "string"}, {"name": "vicuna-13b-v1.5", "dtype": "string"}, {"name": "vicuna-7b-v1.5", "dtype": "string"}, {"name": "wizardLM-13b-v1.2", "dtype": "string"}, {"name": "wizardLM-70b-v1.0", "dtype": "string"}, {"name": "yi-34b", "dtype": "string"}, {"name": "yi-34b-chat", "dtype": "string"}, {"name": "yi-6b", "dtype": "string"}, {"name": "yi-6b-chat", "dtype": "string"}, {"name": "zephyr-7b-alpha", "dtype": "string"}, {"name": "zephyr-7b-beta", "dtype": "string"}]}, {"name": "basic_response_1", "struct": [{"name": "admit limitations or mistakes", "dtype": "int64"}, {"name": "authoritative tone", "dtype": "int64"}, {"name": "clear and understandable", "dtype": "int64"}, {"name": "complex word usage and sentence structure", "dtype": "int64"}, {"name": "friendly", "dtype": "int64"}, {"name": "funny and humorous", "dtype": "int64"}, {"name": "grammar, spelling, punctuation, and code-switching", "dtype": "int64"}, {"name": "harmlessness", "dtype": "int64"}, {"name": "information richness without considering inaccuracy", "dtype": "int64"}, {"name": "innovative and novel", "dtype": "int64"}, {"name": "interactive", "dtype": "int64"}, {"name": "metaphors, personification, similes, hyperboles, irony, parallelism", "dtype": "int64"}, {"name": "persuade user", "dtype": "int64"}, {"name": "polite", "dtype": "int64"}, {"name": "relevance without considering inaccuracy", "dtype": "int64"}, {"name": "repetitive", "dtype": "int64"}, {"name": "step by step solution", "dtype": "int64"}, {"name": "use of direct 
and explicit supporting materials", "dtype": "int64"}, {"name": "use of informal expressions", "dtype": "int64"}, {"name": "well formatted", "dtype": "int64"}]}, {"name": "basic_response_2", "struct": [{"name": "admit limitations or mistakes", "dtype": "int64"}, {"name": "authoritative tone", "dtype": "int64"}, {"name": "clear and understandable", "dtype": "int64"}, {"name": "complex word usage and sentence structure", "dtype": "int64"}, {"name": "friendly", "dtype": "int64"}, {"name": "funny and humorous", "dtype": "int64"}, {"name": "grammar, spelling, punctuation, and code-switching", "dtype": "int64"}, {"name": "harmlessness", "dtype": "int64"}, {"name": "information richness without considering inaccuracy", "dtype": "int64"}, {"name": "innovative and novel", "dtype": "int64"}, {"name": "interactive", "dtype": "int64"}, {"name": "metaphors, personification, similes, hyperboles, irony, parallelism", "dtype": "int64"}, {"name": "persuade user", "dtype": "int64"}, {"name": "polite", "dtype": "int64"}, {"name": "relevance without considering inaccuracy", "dtype": "int64"}, {"name": "repetitive", "dtype": "int64"}, {"name": "step by step solution", "dtype": "int64"}, {"name": "use of direct and explicit supporting materials", "dtype": "int64"}, {"name": "use of informal expressions", "dtype": "int64"}, {"name": "well formatted", "dtype": "int64"}]}, {"name": "errors_response_1", "struct": [{"name": "applicable or not", "dtype": "string"}, {"name": "errors", "list": [{"name": "brief description", "dtype": "string"}, {"name": "severity", "dtype": "string"}, {"name": "type", "dtype": "string"}]}]}, {"name": "errors_response_2", "struct": [{"name": "applicable or not", "dtype": "string"}, {"name": "errors", "list": [{"name": "brief description", "dtype": "string"}, {"name": "severity", "dtype": "string"}, {"name": "type", "dtype": "string"}]}]}, {"name": "query-specific_response_1", "struct": [{"name": "clarify user intent", "dtype": "int64"}, {"name": "correcting explicit mistakes or biases", "sequence": "string"}, {"name": "satisfying explicit constraints", "sequence": "string"}, {"name": "showing empathetic", "dtype": "int64"}, {"name": "supporting explicit subjective stances", "sequence": "string"}]}, {"name": "query-specific_response_2", "struct": [{"name": "clarify user intent", "dtype": "int64"}, {"name": "correcting explicit mistakes or biases", "sequence": "string"}, {"name": "satisfying explicit constraints", "sequence": "string"}, {"name": "showing empathetic", "dtype": "int64"}, {"name": "supporting explicit subjective stances", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 27617371, "num_examples": 5240}], "download_size": 13124269, "dataset_size": 27617371}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-14T05:46:10+00:00 | [] | [
"en"
] | TAGS
#language-English #license-cc-by-nc-4.0 #region-us
| Introduction
------------
We release the annotated data used in Dissecting Human and LLM Preferences.
*Original Dataset* - The dataset is based on lmsys/chatbot\_arena\_conversations, which contains 33K cleaned conversations with pairwise human preferences collected from 13K unique IP addresses on the Chatbot Arena from April to June 2023.
*Filtering and Scenario-wise Sampling* - We filter out conversations that are not in English, that are labeled "Tie" or "Both Bad", and that are multi-turn. We first sample 400 samples with unsafe queries according to the OpenAI moderation API tags and the additional toxic tags in the original dataset, then we apply Auto-J's scenario classifier to determine the scenario of each sample (we merge Auto-J's scenarios into 10 new ones). For the *Knowledge-aware* and *Others* scenarios, we pick 820 samples, and for the other scenarios, we pick 400 samples. The total number is 5,240.
*Collecting Preferences* - Besides the human preference labels from the original dataset, we also collect binary preference labels from 32 LLMs, including 2 proprietary LLMs and 30 open-source ones.
*Annotation on Defined Properties* - We define a set of 29 properties and annotate how each property is satisfied (via a Likert-scale rating or a property-specific annotation) in all responses ($5,240\times 2=10,480$). See our paper for more details on the defined properties.
Dataset Overview
----------------
Each sample is a json record.

The following fields are basic information:
* query: The user query.
* scenario\_auto-j: The scenario classified by Auto-J's classifier.
* scenario\_group: One of the 10 new scenarios we merged from Auto-J's scenarios, including an *Unsafe Query* scenario.
* response\_1/response\_2: The content of a response:
+ content: The text content.
+ model: The model that generated this response.
+ num\_words: The number of words of this response, determined by NLTK.
* gpt-4-turbo\_reference: A reference response generated by GPT-4-Turbo.
The following fields are Query-Specific prerequisites. For the last three, the list may be empty if there are no constraints/stances/mistakes.
* clear intent: Whether the intent of the user is clearly expressed in the query, "Yes" or "No".
* explicitly express feelings: Whether the user clearly expresses his/her feelings or emotions in the query, "Yes" or "No".
* explicit constraints: A list containing all the explicit constraints in the query.
* explicit subjective stances: A list containing all the subjective stances in the query.
* explicit mistakes or biases: A list containing all the mistakes or biases in the query.
The following fields are the main body of the annotation.
* preference\_labels: The preference label for each judge (human or an LLM) indicating which response is preferred in a pair, "response\_1/response\_2".
* basic\_response\_1/basic\_response\_2: The annotated ratings of the 20 basic properties (except *lengthy*) for the response.
+ property\_name: 0/1/2/3
+ ...
* errors\_response\_1/errors\_response\_2: The detected errors of the response.
+ applicable or not: Whether GPT-4-Turbo finds that it can reliably detect the errors in the response.
+ errors: A list containing the detected errors in the response.
- brief description: A brief description of the error.
- severity: How much the error affects the overall correctness of the response, "severe/moderate/minor".
- type: The type of the error, "factual error/information contradiction to the query/math operation error/code generation error"
* query-specific\_response\_1/query-specific\_response\_2: The annotation results of the Query-Specific properties.
+ clarify user intent: If the user intent is not clear, rate how much the response helps clarify the intent, 0/1/2/3.
+ showing empathetic: If the user expresses feelings or emotions, rate how much the response shows empathy, 0/1/2/3.
+ satisfying explicit constraints: If there are explicit constraints in the query, rate how much the response satisfies each of them.
- A list of "{description of constraint} | 0/1/2/3"
+ correcting explicit mistakes or biases: If there are mistakes or biases in the query, classify how the response corrects each of them
- A list of "{description of mistake} | Pointed out and corrected/Pointed out but not corrected/Corrected without being pointed out/Neither pointed out nor corrected"
+ supporting explicit subjective stances: If there are subjective stances in the query, classify how the response supports each of them
- A list of "{description of stance} | Strongly supported/Weakly supported/Neutral/Weakly opposed/Strongly opposed"
Statistics
----------
Number of samples meeting 5 Query-specific prerequisites.
Mean Score/Count for each property in collected data. \*The average scores of 5 query-specific properties are calculated only on samples where the queries met specific prerequisites.
Property correlation in the annotated data.

Disclaimers and Terms
---------------------
This part is copied from the original dataset.
* This dataset contains conversations that may be considered unsafe, offensive, or upsetting. It is not intended for training dialogue agents without applying appropriate filtering measures. We are not responsible for any outputs of the models trained on this dataset.
* Statements or opinions made in this dataset do not reflect the views of researchers or institutions involved in the data collection effort.
* Users of this data are responsible for ensuring its appropriate use, which includes abiding by any applicable laws and regulations.
* Users of this data should adhere to the terms of use for a specific model when using its direct outputs.
* Users of this data agree to not attempt to determine the identity of individuals in this dataset.
License
-------
Following the original dataset, this dataset is licensed under CC-BY-NC-4.0.
| [] | [
"TAGS\n#language-English #license-cc-by-nc-4.0 #region-us \n"
] | [
21
] | [
"passage: TAGS\n#language-English #license-cc-by-nc-4.0 #region-us \n"
] |
01ab3c69f6601d42307951aed2846c7cc533c4e7 |
# Dataset Card for Evaluation run of FelixChao/Scorpio-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/Scorpio-7B](https://huggingface.co/FelixChao/Scorpio-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__Scorpio-7B",
"harness_winogrande_5",
split="train")
```
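The aggregated metrics live in the "results" configuration and can be loaded the same way (a sketch; the "latest" split name is assumed to follow the same layout as the per-task configurations):

```python
from datasets import load_dataset

# "latest" always points at the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_FelixChao__Scorpio-7B",
	"results",
	split="latest")
```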
## Latest results
These are the [latest results from run 2024-02-14T06:06:41.057848](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Scorpio-7B/blob/main/results_2024-02-14T06-06-41.057848.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.652720861598439,
"acc_stderr": 0.03206963180616428,
"acc_norm": 0.6521299642533669,
"acc_norm_stderr": 0.03274061096321565,
"mc1": 0.5593635250917993,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.7250768130559095,
"mc2_stderr": 0.014525579582282969
},
"harness|arc:challenge|25": {
"acc": 0.6868600682593856,
"acc_stderr": 0.013552671543623494,
"acc_norm": 0.7133105802047781,
"acc_norm_stderr": 0.013214986329274783
},
"harness|hellaswag|10": {
"acc": 0.6979685321649074,
"acc_stderr": 0.004582004744713377,
"acc_norm": 0.8849830711013742,
"acc_norm_stderr": 0.0031839033919416975
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337128,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337128
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.01612927102509986,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.01612927102509986
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.02370309952525818,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.02370309952525818
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4324022346368715,
"acc_stderr": 0.01656897123354861,
"acc_norm": 0.4324022346368715,
"acc_norm_stderr": 0.01656897123354861
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135107,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.012731102790504512,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.012731102790504512
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5593635250917993,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.7250768130559095,
"mc2_stderr": 0.014525579582282969
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.010430917468237431
},
"harness|gsm8k|5": {
"acc": 0.7187263078089462,
"acc_stderr": 0.012384789310940244
}
}
```
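For quick checks it can be handier to read these numbers programmatically than to scan the JSON above. A minimal sketch, assuming the JSON block above has been saved to a local file named `results.json` (the file name is illustrative, not part of the dataset):
```python
import json

# results.json is an assumed local copy of the JSON block shown above.
with open("results.json") as f:
    run = json.load(f)

# Headline metrics, using the exact keys that appear in the JSON above.
print("ARC-Challenge (25-shot, acc_norm):", run["harness|arc:challenge|25"]["acc_norm"])
print("HellaSwag (10-shot, acc_norm):", run["harness|hellaswag|10"]["acc_norm"])
print("TruthfulQA (0-shot, mc2):", run["harness|truthfulqa:mc|0"]["mc2"])
print("Winogrande (5-shot, acc):", run["harness|winogrande|5"]["acc"])
print("GSM8K (5-shot, acc):", run["harness|gsm8k|5"]["acc"])
print("Mean accuracy across all tasks:", run["all"]["acc"])
```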
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_FelixChao__Scorpio-7B | [
"region:us"
] | 2024-02-14T06:08:57+00:00 | {"pretty_name": "Evaluation run of FelixChao/Scorpio-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/Scorpio-7B](https://huggingface.co/FelixChao/Scorpio-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__Scorpio-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T06:06:41.057848](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Scorpio-7B/blob/main/results_2024-02-14T06-06-41.057848.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.652720861598439,\n \"acc_stderr\": 0.03206963180616428,\n \"acc_norm\": 0.6521299642533669,\n \"acc_norm_stderr\": 0.03274061096321565,\n \"mc1\": 0.5593635250917993,\n \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.7250768130559095,\n \"mc2_stderr\": 0.014525579582282969\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6868600682593856,\n \"acc_stderr\": 0.013552671543623494,\n \"acc_norm\": 0.7133105802047781,\n \"acc_norm_stderr\": 0.013214986329274783\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6979685321649074,\n \"acc_stderr\": 0.004582004744713377,\n \"acc_norm\": 0.8849830711013742,\n \"acc_norm_stderr\": 0.0031839033919416975\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337128,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337128\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n 
\"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6794871794871795,\n \"acc_stderr\": 
0.02366129639396428,\n \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8293577981651377,\n \"acc_stderr\": 0.01612927102509986,\n \"acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.01612927102509986\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 
0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.02370309952525818,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.02370309952525818\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n \"acc_stderr\": 0.01656897123354861,\n \"acc_norm\": 0.4324022346368715,\n \"acc_norm_stderr\": 0.01656897123354861\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135107,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135107\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n \"acc_stderr\": 0.012731102790504512,\n \"acc_norm\": 0.46088657105606257,\n \"acc_norm_stderr\": 0.012731102790504512\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5593635250917993,\n \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.7250768130559095,\n \"mc2_stderr\": 0.014525579582282969\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.010430917468237431\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7187263078089462,\n \"acc_stderr\": 0.012384789310940244\n }\n}\n```", "repo_url": "https://huggingface.co/FelixChao/Scorpio-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|arc:challenge|25_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|gsm8k|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hellaswag|10_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T06-06-41.057848.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T06-06-41.057848.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T06-06-41.057848.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T06-06-41.057848.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T06-06-41.057848.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T06-06-41.057848.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["**/details_harness|winogrande|5_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T06-06-41.057848.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T06_06_41.057848", "path": ["results_2024-02-14T06-06-41.057848.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T06-06-41.057848.parquet"]}]}]} | 2024-02-14T06:09:19+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of FelixChao/Scorpio-7B
Dataset automatically created during the evaluation run of model FelixChao/Scorpio-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
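A minimal sketch (the repository name below assumes the leaderboard's usual `details_<org>__<model>` naming for this model):

```python
from datasets import load_dataset

# Repository name assumed from the leaderboard's standard naming scheme
data = load_dataset("open-llm-leaderboard/details_FelixChao__Scorpio-7B",
	"harness_winogrande_5",
	split="train")
```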
## Latest results
These are the latest results from run 2024-02-14T06:06:41.057848 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of FelixChao/Scorpio-7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Scorpio-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T06:06:41.057848(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of FelixChao/Scorpio-7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Scorpio-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T06:06:41.057848(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
179,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of FelixChao/Scorpio-7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Scorpio-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T06:06:41.057848(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
accc435e10f25ede3df8ad39095f6bed9b3fbe93 | # Dataset Card for "ControlLM_Personalities"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | WENGSYX/ControlLM_Personalities | [
"region:us"
] | 2024-02-14T06:17:25+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer_matching_behavior", "dtype": "string"}, {"name": "answer_not_matching_behavior", "dtype": "string"}], "splits": [{"name": "Extraversion", "num_bytes": 18884, "num_examples": 148}, {"name": "Neuroticism", "num_bytes": 12503, "num_examples": 98}, {"name": "Conscientiousness", "num_bytes": 20532, "num_examples": 164}, {"name": "Agreeableness", "num_bytes": 18952, "num_examples": 148}, {"name": "Openness", "num_bytes": 13828, "num_examples": 108}, {"name": "Obsequiousness", "num_bytes": 11949, "num_examples": 100}], "download_size": 34419, "dataset_size": 96648}} | 2024-02-14T16:12:40+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ControlLM_Personalities"
More Information needed | [
"# Dataset Card for \"ControlLM_Personalities\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ControlLM_Personalities\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"ControlLM_Personalities\"\n\nMore Information needed"
] |
568a1ed1f00b86edf0cd9550a27f36b6f28df89e | # Dataset Card for "high_vs_random_with_med_low_min_100_issues_per_repo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | kristmh/high_vs_random_with_med_low_min_100_issues_per_repo | [
"region:us"
] | 2024-02-14T07:31:13+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validate", "path": "data/validate-*"}]}], "dataset_info": {"features": [{"name": "Unnamed: 0", "dtype": "int64"}, {"name": "text_clean", "dtype": "string"}, {"name": "label", "dtype": "int64"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 12471198, "num_examples": 11676}, {"name": "train", "num_bytes": 102559638, "num_examples": 93401}, {"name": "validate", "num_bytes": 12809891, "num_examples": 11675}], "download_size": 60396205, "dataset_size": 127840727}} | 2024-02-14T07:31:39+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "high_vs_random_with_med_low_min_100_issues_per_repo"
More Information needed | [
"# Dataset Card for \"high_vs_random_with_med_low_min_100_issues_per_repo\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"high_vs_random_with_med_low_min_100_issues_per_repo\"\n\nMore Information needed"
] | [
6,
34
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"high_vs_random_with_med_low_min_100_issues_per_repo\"\n\nMore Information needed"
] |
7a25099c5fc3f443bc1b3a73dd274bd0982ac4e5 |
# Aya_ja
<!-- Provide a quick summary of the dataset. -->
This dataset extracts only the Japanese instruction data from `CohereForAI/aya_dataset`.
It contains 6,259 human-annotated instruction-response pairs.
## Usage example in Python
```python
from datasets import load_dataset
aya_ja = load_dataset(
    "ryota39/Aya_ja",
    split="train",
)
```
## Examples
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
```json
[
{
"inputs": "火縄銃の威力が全国に知られる事となった、1575年に織田・徳川連合軍が鉄砲隊を用いて武田勝頼率いる騎馬隊を破った戦いを何というでしょう?",
"targets": "長篠の戦いです。",
"language": "Japanese",
"language_code": "jpn",
"annotation_type": "original-annotations",
"user_id": "9881e959174fc20243c2b43c01599473325a93d056e73dbc20a9a0a03514026e"
},
{
"inputs": "陸上のリレー競技で次の走者に渡すのはバトンですが、駅伝競技で次の走者に渡すのは何でしょう?",
"targets": "たすきです。",
"language": "Japanese",
"language_code": "jpn",
"annotation_type": "original-annotations",
"user_id": "9881e959174fc20243c2b43c01599473325a93d056e73dbc20a9a0a03514026e"
},
{
"inputs": "路線図上は、品川駅と田町駅の間に位置している、2020年3月14日に開業したJR東日本・山手線の新駅の名称は何?",
"targets": "高輪ゲートウェイ駅です。",
"language": "Japanese",
"language_code": "jpn",
"annotation_type": "original-annotations",
"user_id": "9881e959174fc20243c2b43c01599473325a93d056e73dbc20a9a0a03514026e"
},
]
```
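As a minimal sketch of working with these records, the fields above can be read directly (the `inputs` and `targets` names come from the examples; the slice size is arbitrary):

```python
from datasets import load_dataset

aya_ja = load_dataset("ryota39/Aya_ja", split="train")

# Print the first three instruction-response pairs
for example in aya_ja.select(range(3)):
    print("inputs :", example["inputs"])
    print("targets:", example["targets"])
```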
## Reference
[CohereForAI/aya_dataset](https://huggingface.co/datasets/CohereForAI/aya_dataset)
| ryota39/Aya_ja | [
"task_categories:question-answering",
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:ja",
"license:apache-2.0",
"region:us"
] | 2024-02-14T08:03:42+00:00 | {"language": ["ja"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["question-answering", "text-generation"]} | 2024-02-14T08:25:06+00:00 | [] | [
"ja"
] | TAGS
#task_categories-question-answering #task_categories-text-generation #size_categories-1K<n<10K #language-Japanese #license-apache-2.0 #region-us
|
# Aya_ja
This dataset extracts only the Japanese instruction data from 'CohereForAI/aya_dataset'.
It contains 6,259 human-annotated instruction-response pairs.
## Usage example in Python
## Examples
## Reference
CohereForAI/aya_dataset
| [
"# Aya_ja\n\n\n\nこのデータセットは'CohereForAI/aya_dataset'の日本語インストラクションデータのみを抽出したデータセットです。\n\n人手でアノテーションされた指示応答のペアが6,259件収録されています。",
"## pythonでの使用例",
"## 例",
"## 参考\nCohereForAI/aya_dataset"
] | [
"TAGS\n#task_categories-question-answering #task_categories-text-generation #size_categories-1K<n<10K #language-Japanese #license-apache-2.0 #region-us \n",
"# Aya_ja\n\n\n\nこのデータセットは'CohereForAI/aya_dataset'の日本語インストラクションデータのみを抽出したデータセットです。\n\n人手でアノテーションされた指示応答のペアが6,259件収録されています。",
"## pythonでの使用例",
"## 例",
"## 参考\nCohereForAI/aya_dataset"
] | [
55,
59,
6,
3,
12
] | [
"passage: TAGS\n#task_categories-question-answering #task_categories-text-generation #size_categories-1K<n<10K #language-Japanese #license-apache-2.0 #region-us \n# Aya_ja\n\n\n\nこのデータセットは'CohereForAI/aya_dataset'の日本語インストラクションデータのみを抽出したデータセットです。\n\n人手でアノテーションされた指示応答のペアが6,259件収録されています。## pythonでの使用例## 例## 参考\nCohereForAI/aya_dataset"
] |
99031028286b8ca2428464f825dc4903a8d77af6 |
# Dataset Card for Evaluation run of tyson0420/stack_llama_fil_ai
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [tyson0420/stack_llama_fil_ai](https://huggingface.co/tyson0420/stack_llama_fil_ai) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tyson0420__stack_llama_fil_ai",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-14T08:29:29.745908](https://huggingface.co/datasets/open-llm-leaderboard/details_tyson0420__stack_llama_fil_ai/blob/main/results_2024-02-14T08-29-29.745908.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.46425954873484,
"acc_stderr": 0.03448386126773048,
"acc_norm": 0.4690334992646356,
"acc_norm_stderr": 0.03527552810754872,
"mc1": 0.2558139534883721,
"mc1_stderr": 0.015274176219283364,
"mc2": 0.38723670137430055,
"mc2_stderr": 0.013588451294593075
},
"harness|arc:challenge|25": {
"acc": 0.5025597269624573,
"acc_stderr": 0.01461119932984378,
"acc_norm": 0.5349829351535836,
"acc_norm_stderr": 0.01457558392201967
},
"harness|hellaswag|10": {
"acc": 0.5892252539334794,
"acc_stderr": 0.004909689876342044,
"acc_norm": 0.7862975502887871,
"acc_norm_stderr": 0.004090813948220234
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4276315789473684,
"acc_stderr": 0.04026097083296557,
"acc_norm": 0.4276315789473684,
"acc_norm_stderr": 0.04026097083296557
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44528301886792454,
"acc_stderr": 0.030588052974270658,
"acc_norm": 0.44528301886792454,
"acc_norm_stderr": 0.030588052974270658
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.02264421261552521,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.02264421261552521
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5161290322580645,
"acc_stderr": 0.028429203176724555,
"acc_norm": 0.5161290322580645,
"acc_norm_stderr": 0.028429203176724555
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.03851716319398393,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.03851716319398393
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.035616254886737454,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.035616254886737454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6528497409326425,
"acc_stderr": 0.03435696168361355,
"acc_norm": 0.6528497409326425,
"acc_norm_stderr": 0.03435696168361355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4153846153846154,
"acc_stderr": 0.02498535492310233,
"acc_norm": 0.4153846153846154,
"acc_norm_stderr": 0.02498535492310233
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3907563025210084,
"acc_stderr": 0.03169380235712997,
"acc_norm": 0.3907563025210084,
"acc_norm_stderr": 0.03169380235712997
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6165137614678899,
"acc_stderr": 0.02084715664191598,
"acc_norm": 0.6165137614678899,
"acc_norm_stderr": 0.02084715664191598
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.028963702570791044,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.028963702570791044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.03498501649369527,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.03498501649369527
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.03145068600744859,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.03145068600744859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5560538116591929,
"acc_stderr": 0.03334625674242728,
"acc_norm": 0.5560538116591929,
"acc_norm_stderr": 0.03334625674242728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536824,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536824
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5030674846625767,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.5030674846625767,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.5533980582524272,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.5533980582524272,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7008547008547008,
"acc_stderr": 0.02999695185834948,
"acc_norm": 0.7008547008547008,
"acc_norm_stderr": 0.02999695185834948
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6168582375478927,
"acc_stderr": 0.017384774194885627,
"acc_norm": 0.6168582375478927,
"acc_norm_stderr": 0.017384774194885627
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.02691864538323901,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.02691864538323901
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.028580341065138293,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.028580341065138293
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751464,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751464
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5061728395061729,
"acc_stderr": 0.027818623962583295,
"acc_norm": 0.5061728395061729,
"acc_norm_stderr": 0.027818623962583295
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35723598435462844,
"acc_stderr": 0.012238615750316503,
"acc_norm": 0.35723598435462844,
"acc_norm_stderr": 0.012238615750316503
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.030343264224213528,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.030343264224213528
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.434640522875817,
"acc_stderr": 0.020054269200726456,
"acc_norm": 0.434640522875817,
"acc_norm_stderr": 0.020054269200726456
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.0478833976870286,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.0478833976870286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.42448979591836733,
"acc_stderr": 0.031642094879429414,
"acc_norm": 0.42448979591836733,
"acc_norm_stderr": 0.031642094879429414
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.03777798822748018,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.03777798822748018
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2558139534883721,
"mc1_stderr": 0.015274176219283364,
"mc2": 0.38723670137430055,
"mc2_stderr": 0.013588451294593075
},
"harness|winogrande|5": {
"acc": 0.7482241515390686,
"acc_stderr": 0.012198489100259769
},
"harness|gsm8k|5": {
"acc": 0.1281273692191054,
"acc_stderr": 0.009206398549980031
}
}
```
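As a hedged sketch, the "all" aggregate above can be recomputed from the per-task entries, assuming the JSON block has been saved locally as `results.json`:

```python
import json

# Assumption: the results JSON above was saved to results.json
with open("results.json") as f:
    results = json.load(f)

# Average normalized accuracy over the individual harness tasks
accs = [v["acc_norm"] for key, v in results.items()
        if key != "all" and "acc_norm" in v]
print(f"mean acc_norm over {len(accs)} tasks: {sum(accs) / len(accs):.4f}")
```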
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_tyson0420__stack_llama_fil_ai | [
"region:us"
] | 2024-02-14T08:24:56+00:00 | {"pretty_name": "Evaluation run of tyson0420/stack_llama_fil_ai", "dataset_summary": "Dataset automatically created during the evaluation run of model [tyson0420/stack_llama_fil_ai](https://huggingface.co/tyson0420/stack_llama_fil_ai) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tyson0420__stack_llama_fil_ai\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T08:29:29.745908](https://huggingface.co/datasets/open-llm-leaderboard/details_tyson0420__stack_llama_fil_ai/blob/main/results_2024-02-14T08-29-29.745908.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46425954873484,\n \"acc_stderr\": 0.03448386126773048,\n \"acc_norm\": 0.4690334992646356,\n \"acc_norm_stderr\": 0.03527552810754872,\n \"mc1\": 0.2558139534883721,\n \"mc1_stderr\": 0.015274176219283364,\n \"mc2\": 0.38723670137430055,\n \"mc2_stderr\": 0.013588451294593075\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5025597269624573,\n \"acc_stderr\": 0.01461119932984378,\n \"acc_norm\": 0.5349829351535836,\n \"acc_norm_stderr\": 0.01457558392201967\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5892252539334794,\n \"acc_stderr\": 0.004909689876342044,\n \"acc_norm\": 0.7862975502887871,\n \"acc_norm_stderr\": 0.004090813948220234\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4276315789473684,\n \"acc_stderr\": 0.04026097083296557,\n \"acc_norm\": 0.4276315789473684,\n \"acc_norm_stderr\": 0.04026097083296557\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.44528301886792454,\n \"acc_stderr\": 0.030588052974270658,\n \"acc_norm\": 0.44528301886792454,\n \"acc_norm_stderr\": 0.030588052974270658\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 
0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.4161849710982659,\n \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.02264421261552521,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.02264421261552521\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5161290322580645,\n \"acc_stderr\": 0.028429203176724555,\n \"acc_norm\": 0.5161290322580645,\n \"acc_norm_stderr\": 0.028429203176724555\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.03851716319398393,\n \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.03851716319398393\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4898989898989899,\n \"acc_stderr\": 0.035616254886737454,\n \"acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.035616254886737454\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6528497409326425,\n \"acc_stderr\": 0.03435696168361355,\n \"acc_norm\": 0.6528497409326425,\n \"acc_norm_stderr\": 0.03435696168361355\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4153846153846154,\n \"acc_stderr\": 0.02498535492310233,\n \"acc_norm\": 0.4153846153846154,\n \"acc_norm_stderr\": 0.02498535492310233\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3907563025210084,\n \"acc_stderr\": 0.03169380235712997,\n \"acc_norm\": 0.3907563025210084,\n \"acc_norm_stderr\": 0.03169380235712997\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6165137614678899,\n \"acc_stderr\": 0.02084715664191598,\n \"acc_norm\": 0.6165137614678899,\n \"acc_norm_stderr\": 0.02084715664191598\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.028963702570791044,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.028963702570791044\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.03498501649369527,\n \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.03498501649369527\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6286919831223629,\n \"acc_stderr\": 0.03145068600744859,\n \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.03145068600744859\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5560538116591929,\n \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.5560538116591929,\n \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968432,\n \"acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968432\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.04812917324536824,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.04812917324536824\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179663,\n \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179663\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5533980582524272,\n \"acc_stderr\": 0.04922424153458933,\n \"acc_norm\": 0.5533980582524272,\n \"acc_norm_stderr\": 0.04922424153458933\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7008547008547008,\n \"acc_stderr\": 0.02999695185834948,\n \"acc_norm\": 0.7008547008547008,\n \"acc_norm_stderr\": 0.02999695185834948\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.6168582375478927,\n \"acc_stderr\": 0.017384774194885627,\n \"acc_norm\": 0.6168582375478927,\n \"acc_norm_stderr\": 0.017384774194885627\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5028901734104047,\n \"acc_stderr\": 0.02691864538323901,\n \"acc_norm\": 0.5028901734104047,\n \"acc_norm_stderr\": 0.02691864538323901\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.028580341065138293,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.028580341065138293\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5061728395061729,\n \"acc_stderr\": 0.027818623962583295,\n \"acc_norm\": 0.5061728395061729,\n \"acc_norm_stderr\": 0.027818623962583295\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35723598435462844,\n \"acc_stderr\": 0.012238615750316503,\n \"acc_norm\": 0.35723598435462844,\n \"acc_norm_stderr\": 0.012238615750316503\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213528,\n \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213528\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.434640522875817,\n \"acc_stderr\": 0.020054269200726456,\n \"acc_norm\": 0.434640522875817,\n \"acc_norm_stderr\": 0.020054269200726456\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n \"acc_stderr\": 0.0478833976870286,\n \"acc_norm\": 0.509090909090909,\n \"acc_norm_stderr\": 0.0478833976870286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.42448979591836733,\n \"acc_stderr\": 0.031642094879429414,\n \"acc_norm\": 0.42448979591836733,\n \"acc_norm_stderr\": 0.031642094879429414\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n \"acc_stderr\": 0.03777798822748018,\n \"acc_norm\": 0.3795180722891566,\n \"acc_norm_stderr\": 0.03777798822748018\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2558139534883721,\n \"mc1_stderr\": 0.015274176219283364,\n \"mc2\": 0.38723670137430055,\n \"mc2_stderr\": 0.013588451294593075\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7482241515390686,\n \"acc_stderr\": 0.012198489100259769\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1281273692191054,\n \"acc_stderr\": 0.009206398549980031\n 
}\n}\n```", "repo_url": "https://huggingface.co/tyson0420/stack_llama_fil_ai", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|arc:challenge|25_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|arc:challenge|25_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|gsm8k|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|gsm8k|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hellaswag|10_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hellaswag|10_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T08-22-30.824549.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T08-22-30.824549.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T08-29-29.745908.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T08-29-29.745908.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T08-29-29.745908.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T08-29-29.745908.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T08-22-30.824549.parquet"]}, 
{"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["**/details_harness|winogrande|5_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": ["**/details_harness|winogrande|5_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T08-29-29.745908.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T08_22_30.824549", "path": ["results_2024-02-14T08-22-30.824549.parquet"]}, {"split": "2024_02_14T08_29_29.745908", "path": 
["results_2024-02-14T08-29-29.745908.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T08-29-29.745908.parquet"]}]}]} | 2024-02-14T08:32:16+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of tyson0420/stack_llama_fil_ai
Dataset automatically created during the evaluation run of model tyson0420/stack_llama_fil_ai on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
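A minimal sketch, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming convention and using one of the configuration names listed in this card's metadata:

```python
from datasets import load_dataset

# repo id inferred from the leaderboard naming convention; adjust if the actual repo differs
data = load_dataset("open-llm-leaderboard/details_tyson0420__stack_llama_fil_ai",
    "harness_winogrande_5",
    split="train")
```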
## Latest results
These are the latest results from run 2024-02-14T08:29:29.745908 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of tyson0420/stack_llama_fil_ai\n\n\n\nDataset automatically created during the evaluation run of model tyson0420/stack_llama_fil_ai on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T08:29:29.745908(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of tyson0420/stack_llama_fil_ai\n\n\n\nDataset automatically created during the evaluation run of model tyson0420/stack_llama_fil_ai on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T08:29:29.745908(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of tyson0420/stack_llama_fil_ai\n\n\n\nDataset automatically created during the evaluation run of model tyson0420/stack_llama_fil_ai on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T08:29:29.745908(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
ca7f88d953fdd4c59c5fae3edcbe3c45539d2a5d |
The dataset comprises mid-wave infrared (MWIR, i.e. thermal) images and videos of point objects.
The images cover three classes: Rocket, Rocket Debris, and Vehicle.
It consists of 14 image sequences, each with its respective ground-truth images and corresponding video sequences.
Every sequence contains MWIR images and their respective ground truths. | avinres/MWIRSTD | [
"license:cc",
"region:us"
] | 2024-02-14T08:34:43+00:00 | {"license": "cc"} | 2024-02-14T08:51:39+00:00 | [] | [] | TAGS
#license-cc #region-us
|
The dataset comprises mid-wave infrared (MWIR, i.e. thermal) images and videos of point objects.
The images cover three classes: Rocket, Rocket Debris, and Vehicle.
It consists of 14 image sequences, each with its respective ground-truth images and corresponding video sequences.
Every sequence contains MWIR images and their respective ground truths. | [] | [
"TAGS\n#license-cc #region-us \n"
] | [
11
] | [
"passage: TAGS\n#license-cc #region-us \n"
] |
a507743f47a1bc1eebfd04dffaff6fdd32cd470d | # Natural Questions Open Dataset with Gold Documents
This dataset is a curated version of the [Natural Questions open dataset](https://huggingface.co/datasets/nq_open),
with the inclusion of the gold documents from the original [Natural Questions](https://huggingface.co/datasets/natural_questions) (NQ) dataset.
The main difference from the NQ-open dataset is that some entries were excluded, as their respective gold documents exceeded 512 tokens in length.
This is due to the pre-processing of the gold documents, as detailed in this related [dataset](https://huggingface.co/datasets/florin-hf/wiki_dump2018_nq_open).
The dataset is designed to facilitate research in question-answering systems, especially focusing on integrating gold documents for training and testing purposes.
## Dataset Sources
The Natural Questions (NQ) dataset is a large-scale collection of real-world queries derived from Google search data. Each
entry in the dataset consists of a user query and the corresponding Wikipedia page containing the answer.
The NQ-open dataset, a subset of the NQ dataset, differs by removing the restriction of linking answers to specific Wikipedia passages, thereby
mimicking a more general information retrieval scenario similar to web searches.
This version of the NQ-open dataset was used in the paper [The Power of Noise: Redefining Retrieval for RAG Systems](https://arxiv.org/abs/2401.14887).
## Dataset Structure
A sample in the dataset has the following format:
```
{
'example_id' (int64): an identifier for the question, consistent with the original NQ dataset,
'question' (str): a question, that is identical to the question in the original NQ,
'answers' (List[str]): the list of correct answers in the original NQ,
'text' (str): gold document, associated with the question, in the original NQ,
'idx_gold_in_corpus' (int64): index of the gold document in the full corpus.
}
Ex.
{
'example_id': -3440030035760311385,
'question': 'who owned the millennium falcon before han solo',
'answers': [Lando Calrissian],
'text': "Han Solo won the Millennium Falcon from Lando Calrissian in the card game ' sabacc ' several years before the events of the film A New Hope..."
'idx_gold_in_corpus': 20995349
}
```
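For instance, the splits can be loaded with the standard `datasets` API (a minimal sketch; the repo id is taken from this card, and the split names are assumed to follow the Train/Validation/Test layout below):

```python
from datasets import load_dataset

# repo id as published on the Hub
dataset = load_dataset("florin-hf/nq_open_gold")

sample = dataset["test"][0]  # assumes splits are named train / validation / test
print(sample["question"], sample["answers"])
```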
## Dataset Splits
- **Train set**: 72,209 examples (50.2 MB)
- **Validation set**: 8,006 examples (5.57 MB)
- **Test set**: 2,889 examples (1.96 MB)
## Citation Information
```
@article{doi:10.1162/tacl\_a\_00276,
author = {Kwiatkowski, Tom and Palomaki, Jennimaria and Redfield, Olivia and Collins, Michael and Parikh, Ankur and Alberti, Chris and Epstein, Danielle and Polosukhin, Illia and Devlin, Jacob and Lee, Kenton and Toutanova, Kristina and Jones, Llion and Kelcey, Matthew and Chang, Ming-Wei and Dai, Andrew M. and Uszkoreit, Jakob and Le, Quoc and Petrov, Slav},
title = {Natural Questions: A Benchmark for Question Answering Research},
journal = {Transactions of the Association for Computational Linguistics},
volume = {7},
number = {},
pages = {453-466},
year = {2019},
doi = {10.1162/tacl\_a\_00276},
URL = {
https://doi.org/10.1162/tacl_a_00276
},
eprint = {
https://doi.org/10.1162/tacl_a_00276
},
abstract = { We present the Natural Questions corpus, a question answering data set. Questions consist of real anonymized, aggregated queries issued to the Google search engine. An annotator is presented with a question along with a Wikipedia page from the top 5 search results, and annotates a long answer (typically a paragraph) and a short answer (one or more entities) if present on the page, or marks null if no long/short answer is present. The public release consists of 307,373 training examples with single annotations; 7,830 examples with 5-way annotations for development data; and a further 7,842 examples with 5-way annotated sequestered as test data. We present experiments validating quality of the data. We also describe analysis of 25-way annotations on 302 examples, giving insights into human variability on the annotation task. We introduce robust metrics for the purposes of evaluating question answering systems; demonstrate high human upper bounds on these metrics; and establish baseline results using competitive methods drawn from related literature. }
}
@inproceedings{lee-etal-2019-latent,
title = "Latent Retrieval for Weakly Supervised Open Domain Question Answering",
author = "Lee, Kenton and
Chang, Ming-Wei and
Toutanova, Kristina",
booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2019",
address = "Florence, Italy",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/P19-1612",
doi = "10.18653/v1/P19-1612",
pages = "6086--6096",
abstract = "Recent work on open domain question answering (QA) assumes strong supervision of the supporting evidence and/or assumes a blackbox information retrieval (IR) system to retrieve evidence candidates. We argue that both are suboptimal, since gold evidence is not always available, and QA is fundamentally different from IR. We show for the first time that it is possible to jointly learn the retriever and reader from question-answer string pairs and without any IR system. In this setting, evidence retrieval from all of Wikipedia is treated as a latent variable. Since this is impractical to learn from scratch, we pre-train the retriever with an Inverse Cloze Task. We evaluate on open versions of five QA datasets. On datasets where the questioner already knows the answer, a traditional IR system such as BM25 is sufficient. On datasets where a user is genuinely seeking an answer, we show that learned retrieval is crucial, outperforming BM25 by up to 19 points in exact match.",
}
@misc{cuconasu2024power,
title={The Power of Noise: Redefining Retrieval for RAG Systems},
author={Florin Cuconasu and Giovanni Trappolini and Federico Siciliano and Simone Filice and Cesare Campagnano and Yoelle Maarek and Nicola Tonellotto and Fabrizio Silvestri},
year={2024},
eprint={2401.14887},
archivePrefix={arXiv},
primaryClass={cs.IR}
}
``` | florin-hf/nq_open_gold | [
"task_categories:question-answering",
"size_categories:10K<n<100K",
"language:en",
"arxiv:2401.14887",
"region:us"
] | 2024-02-14T08:50:05+00:00 | {"language": ["en"], "size_categories": ["10K<n<100K"], "task_categories": ["question-answering"]} | 2024-02-16T09:30:14+00:00 | [
"2401.14887"
] | [
"en"
] | TAGS
#task_categories-question-answering #size_categories-10K<n<100K #language-English #arxiv-2401.14887 #region-us
| # Natural Questions Open Dataset with Gold Documents
This dataset is a curated version of the Natural Questions open dataset,
with the inclusion of the gold documents from the original Natural Questions (NQ) dataset.
The main difference with the NQ-open dataset is that some entries were excluded, as their respective gold documents exceeded 512 tokens in length.
This is due to the pre-processing of the gold documents, as detailed in this related dataset.
The dataset is designed to facilitate research in question-answering systems, especially focusing on integrating gold documents for training and testing purposes.
## Dataset Sources
The Natural Questions (NQ) dataset is a large-scale collection of real-world queries derived from Google search data. Each
entry in the dataset consists of a user query and the corresponding Wikipedia page containing the answer.
The NQ-open dataset, a subset of the NQ dataset, differs by removing the restriction of linking answers to specific Wikipedia passages, thereby
mimicking a more general information retrieval scenario similar to web searches.
This version of the NQ-open dataset was used in the paper The Power of Noise: Redefining Retrieval for RAG Systems.
## Dataset Structure
A sample in the dataset has the following format:
## Dataset Splits
- Train set: 72,209 examples (50.2 MB)
- Validation set: 8,006 examples (5.57 MB)
- Test set: 2,889 examples (1.96 MB)
| [
"# Natural Questions Open Dataset with Gold Documents\n\nThis dataset is a curated version of the Natural Questions open dataset, \nwith the inclusion of the gold documents from the original Natural Questions (NQ) dataset.\nThe main difference with the NQ-open dataset is that some entries were excluded, as their respective gold documents exceeded 512 tokens in length.\nThis is due to the pre-processing of the gold documents, as detailed in this related dataset.\n\nThe dataset is designed to facilitate research in question-answering systems, especially focusing on integrating gold documents for training and testing purposes.",
"## Dataset Sources\nThe Natural Questions (NQ) dataset is a large-scale collection of real-world queries derived from Google search data. Each\nentry in the dataset consists of a user query and the corresponding Wikipedia page containing the answer. \n\nThe NQ-open dataset, a subset of the NQ dataset, differs by removing the restriction of linking answers to specific Wikipedia passages, thereby\nmimicking a more general information retrieval scenario similar to web searches.\n\nThis version of the NQ-open dataset was used in the paper The Power of Noise: Redefining Retrieval for RAG Systems.",
"## Dataset Structure\n\nA sample in the dataset has the following format:",
"## Dataset Splits\n\n- Train set: 72,209 (50,2 MB)\n- Validation set: 8,006 (5,57 BM)\n- Test set: 2889 (1,96 MB)"
] | [
"TAGS\n#task_categories-question-answering #size_categories-10K<n<100K #language-English #arxiv-2401.14887 #region-us \n",
"# Natural Questions Open Dataset with Gold Documents\n\nThis dataset is a curated version of the Natural Questions open dataset, \nwith the inclusion of the gold documents from the original Natural Questions (NQ) dataset.\nThe main difference with the NQ-open dataset is that some entries were excluded, as their respective gold documents exceeded 512 tokens in length.\nThis is due to the pre-processing of the gold documents, as detailed in this related dataset.\n\nThe dataset is designed to facilitate research in question-answering systems, especially focusing on integrating gold documents for training and testing purposes.",
"## Dataset Sources\nThe Natural Questions (NQ) dataset is a large-scale collection of real-world queries derived from Google search data. Each\nentry in the dataset consists of a user query and the corresponding Wikipedia page containing the answer. \n\nThe NQ-open dataset, a subset of the NQ dataset, differs by removing the restriction of linking answers to specific Wikipedia passages, thereby\nmimicking a more general information retrieval scenario similar to web searches.\n\nThis version of the NQ-open dataset was used in the paper The Power of Noise: Redefining Retrieval for RAG Systems.",
"## Dataset Structure\n\nA sample in the dataset has the following format:",
"## Dataset Splits\n\n- Train set: 72,209 (50,2 MB)\n- Validation set: 8,006 (5,57 BM)\n- Test set: 2889 (1,96 MB)"
] | [
43,
135,
145,
17,
43
] | [
"passage: TAGS\n#task_categories-question-answering #size_categories-10K<n<100K #language-English #arxiv-2401.14887 #region-us \n# Natural Questions Open Dataset with Gold Documents\n\nThis dataset is a curated version of the Natural Questions open dataset, \nwith the inclusion of the gold documents from the original Natural Questions (NQ) dataset.\nThe main difference with the NQ-open dataset is that some entries were excluded, as their respective gold documents exceeded 512 tokens in length.\nThis is due to the pre-processing of the gold documents, as detailed in this related dataset.\n\nThe dataset is designed to facilitate research in question-answering systems, especially focusing on integrating gold documents for training and testing purposes.## Dataset Sources\nThe Natural Questions (NQ) dataset is a large-scale collection of real-world queries derived from Google search data. Each\nentry in the dataset consists of a user query and the corresponding Wikipedia page containing the answer. \n\nThe NQ-open dataset, a subset of the NQ dataset, differs by removing the restriction of linking answers to specific Wikipedia passages, thereby\nmimicking a more general information retrieval scenario similar to web searches.\n\nThis version of the NQ-open dataset was used in the paper The Power of Noise: Redefining Retrieval for RAG Systems.## Dataset Structure\n\nA sample in the dataset has the following format:## Dataset Splits\n\n- Train set: 72,209 (50,2 MB)\n- Validation set: 8,006 (5,57 BM)\n- Test set: 2889 (1,96 MB)"
] |
fe373b5974a3270d2af48707f10ce1e9b87aa592 | # Dataset Card for "ft-capstone"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | rpadilla/ft-capstone | [
"region:us"
] | 2024-02-14T08:51:49+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2614, "num_examples": 12}, {"name": "test", "num_bytes": 1969, "num_examples": 7}], "download_size": 6491, "dataset_size": 4583}} | 2024-02-14T08:51:52+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ft-capstone"
More Information needed | [
"# Dataset Card for \"ft-capstone\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ft-capstone\"\n\nMore Information needed"
] | [
6,
14
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"ft-capstone\"\n\nMore Information needed"
] |
33f9a2106d9cb59970718e347cf77e0302882b74 |
# C4, T5 tokenized, in ragged array format
Processed distribution of Google's [C4](https://www.tensorflow.org/datasets/catalog/c4) dataset: a colossal, cleaned version of [Common Crawl](https://commoncrawl.org)'s web crawl corpus.
Uses the text data from [`allenai/c4`](https://huggingface.co/datasets/allenai/c4).
Includes `en` subset only.
The T5 tokenizer was applied to the text.
Distributed as a ragged array.
Converted via [`json_to_ragged.py`](https://github.com/Birch-san/pre-tokenize/blob/main/script/json_to_ragged.py).
Download size of all shards:
| Split | Data+Lengths Size | Divided across `n` Shards | Typical shard size: `data.npy` | Typical shard size: `len.npy` |
|-|-|-|-|-|
| Train | 293G | 1024 | 344M | 1.4M |
| Test | 299M | 8 | 44M | 179K |
| **Total** | **296G** | _N/A_ | _N/A_ | _N/A_ |
The data is uncompressed, in order to preserve support for random-seeking.
`.data.npy` would probably benefit from compression, because token sequences exhibit patterns.
Tokenization achieves a ~44% compression ratio.
Allen AI's original gzipped JSONL text data achieved a ~61% compression ratio.
So the tokenized distribution is about 13% bigger.
Download everything via:
```bash
pip install hf_transfer "huggingface_hub[cli]"
HF_HUB_ENABLE_HF_TRANSFER=True huggingface-cli download --repo-type dataset --local-dir . --local-dir-use-symlinks False Birchlabs/c4-t5-ragged
```
Download a single ragged array to try it out:
```bash
huggingface-cli download --repo-type dataset --local-dir . --local-dir-use-symlinks False Birchlabs/c4-t5-ragged en/validation/c4-validation.00000-of-00008.{data,len}.npy
```
Read ragged arrays like so:
https://github.com/Birch-san/pre-tokenize/blob/main/script/read_ragged.py
The basic idea is:
`data.npy` is a very long 1D numpy array of tokens.
`len.npy` is a 1D numpy array describing how long each sample in `data.npy` is.
To read sample 0 from `data.npy`, you would:
- start at index 0 in `data.npy`
- check sample 0's length (position 0 in `len.npy`)
- read from index 0 to index 0 + length-of-sample-0
To read sample 1 from `data.npy`, you would:
- start at the end of sample 0.
- check sample 1's length (position 1 in `len.npy`)
- read from end-of-sample-0 to end-of-sample-0 + length-of-sample-1
We can obtain an index of sample ending positions by adding each sample length as we go along (`lengths.cumsum()`).
We can obtain an index of sample starting positions by prepending the aforementioned endings index with a 0.
[`read_ragged.py`](https://github.com/Birch-san/pre-tokenize/blob/main/script/read_ragged.py) demonstrates how to create this index, and use it to achieve random access.
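In code, the index construction and lookup boil down to something like this (a minimal sketch, not a replacement for `read_ragged.py`; the shard filename is a placeholder for whichever `.data.npy`/`.len.npy` pair you downloaded):
```python
import numpy as np

# Memory-map the token stream so random access doesn't require loading
# the whole shard into RAM (important for the 293G train split).
data = np.load("c4-validation.00000-of-00008.data.npy", mmap_mode="r")
lens = np.load("c4-validation.00000-of-00008.len.npy")

ends = lens.cumsum()                       # where each sample ends
starts = np.concatenate(([0], ends[:-1]))  # where each sample starts

def get_sample(i: int) -> np.ndarray:
    """Return the tokens of sample i as a 1D array."""
    return np.asarray(data[starts[i]:ends[i]])

print(get_sample(0))  # sample 0: data[0 : len_0]
print(get_sample(1))  # sample 1: data[end_of_sample_0 : end_of_sample_0 + len_1]
```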
**This isn't ready for use in torch DataLoader.**
This dataset format is intended as a _precursor_, from which you could create a dataset in a different format.
For example, you might want to iterate over every sample here, chunking by a fixed context length, and output the samples via .parquet chunks for use with torch DataLoader.
That's an easy way out, but your disk won't thank you if you do fully-random access.
An approach that hits the disk less / requires less RAM, would be to implement an IterableDataset, where you iterate sequentially over shards but shuffle within-shard (or shuffle within a smaller-than-shard buffer).
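A rough sketch of that shuffle-buffer idea, assuming PyTorch is available (the shard stems and buffer size are illustrative placeholders, not part of this dataset):
```python
import random
import numpy as np
from torch.utils.data import IterableDataset

class RaggedShards(IterableDataset):
    """Iterate shards sequentially, shuffling within a bounded buffer."""

    def __init__(self, shard_stems, buffer_size=10_000, seed=0):
        self.shard_stems = shard_stems  # e.g. ["en/train/c4-train.00000-of-01024", ...]
        self.buffer_size = buffer_size
        self.rng = random.Random(seed)

    def _iter_shard(self, stem):
        data = np.load(f"{stem}.data.npy", mmap_mode="r")
        lens = np.load(f"{stem}.len.npy")
        start = 0
        for end in lens.cumsum():
            yield np.asarray(data[start:end])
            start = end

    def __iter__(self):
        buffer = []
        for stem in self.shard_stems:
            for sample in self._iter_shard(stem):
                if len(buffer) < self.buffer_size:
                    buffer.append(sample)
                else:
                    # Swap a random buffered sample out for the new one.
                    j = self.rng.randrange(self.buffer_size)
                    yield buffer[j]
                    buffer[j] = sample
        self.rng.shuffle(buffer)  # drain whatever is left at the end
        yield from buffer
```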
You might also want to perform analyses over the `.len.npy` to decide how to pack these sequences (e.g. packing a 128 and 384 sequence into a 512 context length).
You can do such an analysis via GraphCore's [packedBERT](https://github.com/graphcore/tutorials/tree/sdk-release-2.1/blogs_code/packedBERT).
Then you could process the data into a "packed" dataset.
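As a starting point, something like the following gives a quick view of the length distribution in a shard (a minimal sketch; the filename is a placeholder, and the fixed 128-token buckets are just for eyeballing, not a packing algorithm):
```python
import numpy as np

lens = np.load("c4-validation.00000-of-00008.len.npy")
print(f"samples: {len(lens)}")
print(f"mean: {lens.mean():.1f}  median: {np.median(lens):.0f}  p95: {np.percentile(lens, 95):.0f}")

# Bucket lengths into 128-token bins, clipping at a 512 context length:
counts, edges = np.histogram(np.minimum(lens, 512), bins=[0, 128, 256, 384, 512])
for lo, hi, n in zip(edges[:-1], edges[1:], counts):
    print(f"{int(lo):>3}-{int(hi):<3}: {n} samples")
```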
### Source Data
#### Initial Data Collection and Normalization
The C4 and mC4 datasets are collections of text sourced from the public Common Crawl web scrape. They include heuristics to extract only natural language (as opposed to boilerplate and other gibberish) in addition to extensive deduplication. You can find the code that has been used to build this dataset in [c4.py](https://github.com/tensorflow/datasets/blob/5952d3d60d60e1727786fa7a9a23d24bb463d4d6/tensorflow_datasets/text/c4.py) by Tensorflow Datasets.
The C4 dataset was explicitly designed to be English only: any page that was not given a probability of at least 99% of being English by [langdetect](https://github.com/Mimino666/langdetect) was discarded.
To build mC4, the authors used [CLD3](https://github.com/google/cld3) to identify over 100 languages.
### Licensing Information
We are releasing this dataset under the terms of [ODC-BY](https://opendatacommons.org/licenses/by/1-0/). By using this, you are also bound by the [Common Crawl terms of use](https://commoncrawl.org/terms-of-use/) in respect of the content contained in the dataset.
### Acknowledgements
Big ups to the good folks at [Common Crawl](https://commoncrawl.org) whose data made this possible ([consider donating](http://commoncrawl.org/donate/)!), to Google for creating the code that curates and filters the data, and to Huggingface, who had no issue with hosting these 3TB of data for public download!
Thanks [Allen AI](https://allenai.org/) for sharing the text that was processed to make this dataset. | Birchlabs/c4-t5-ragged | [
"task_categories:text-generation",
"task_categories:fill-mask",
"task_ids:language-modeling",
"task_ids:masked-language-modeling",
"annotations_creators:no-annotation",
"language_creators:found",
"size_categories:n<1K",
"size_categories:1K<n<10K",
"size_categories:10K<n<100K",
"size_categories:100K<n<1M",
"size_categories:1M<n<10M",
"size_categories:10M<n<100M",
"size_categories:100M<n<1B",
"size_categories:1B<n<10B",
"source_datasets:original",
"language:en",
"license:odc-by",
"region:us"
] | 2024-02-14T09:05:48+00:00 | {"annotations_creators": ["no-annotation"], "language_creators": ["found"], "language": ["en"], "license": ["odc-by"], "size_categories": ["n<1K", "1K<n<10K", "10K<n<100K", "100K<n<1M", "1M<n<10M", "10M<n<100M", "100M<n<1B", "1B<n<10B"], "source_datasets": ["original"], "task_categories": ["text-generation", "fill-mask"], "task_ids": ["language-modeling", "masked-language-modeling"], "paperswithcode_id": "c4", "pretty_name": "C4"} | 2024-02-16T01:37:46+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #task_categories-fill-mask #task_ids-language-modeling #task_ids-masked-language-modeling #annotations_creators-no-annotation #language_creators-found #size_categories-n<1K #size_categories-1K<n<10K #size_categories-10K<n<100K #size_categories-100K<n<1M #size_categories-1M<n<10M #size_categories-10M<n<100M #size_categories-100M<n<1B #size_categories-1B<n<10B #source_datasets-original #language-English #license-odc-by #region-us
| C4, T5 tokenized, in ragged array format
========================================
Processed distribution of Google's C4 dataset: a colossal, cleaned version of Common Crawl's web crawl corpus.
Uses the text data from 'allenai/c4'.
Includes 'en' subset only.
T5 tokenizer was applied to the text.
Distributed as a ragged array.
Converted via 'json\_to\_ragged.py'.
Download size of all shards:
The data is uncompressed, in order to preserve support for random-seeking.
'.URL' would probably benefit from compression, because token sequences exhibit patterns.
Tokenization achieves a ~44% compression ratio.
Allen AI's original gzipped JSONL text data achieved a ~61% compression ratio.
So tokenized is about 13% bigger.
Download everything via:
Download a single ragged array to try it out:
Read ragged arrays like so:
URL
The basic idea is:
'URL' is a very long 1D numpy array of tokens.
'URL' is a 1D numpy array describing how long each sample in 'URL' is.
To read sample 0 from 'URL', you would:
* start at index 0 in 'URL'
* check sample 0's length (position 0 in 'URL')
* read from index 0 to index 0 + length-of-sample-0
To read sample 1 from 'URL', you would:
* start at the end of sample 0.
* check sample 1's length (position 1 in 'URL')
* read from end-of-sample-0 to end-of-sample-0 + length-of-sample-1
We can obtain an index of sample ending positions by adding each sample length as we go along (URL()).
We can obtain an index of sample starting positions by prepending the aforementioned endings index with a 0.
'read\_ragged.py' demonstrates how to create this index, and use it to achieve random access.
This isn't ready for use in torch DataLoader.
This dataset format is intended as a *precursor*, from which you could create a dataset in a different format.
For example, you might want to iterate over every sample here, chunking by a fixed context length, and output the samples via .parquet chunks for use with torch DataLoader.
That's an easy way out, but your disk won't thank you if you do fully-random access.
An approach that hits the disk less / requires less RAM, would be to implement an IterableDataset, where you iterate sequentially over shards but shuffle within-shard (or shuffle within a smaller-than-shard buffer).
You might also want to perform analyses over the '.URL' to decide how to pack these sequences (e.g. packing a 128 and 384 sequence into a 512 context length).
You can do such an analysis via GraphCore's packedBERT.
Then you could process the data into a "packed" dataset.
### Source Data
#### Initial Data Collection and Normalization
The C4 and mC4 datasets are collections of text sourced from the public Common Crawl web scrape. They include heuristics to extract only natural language (as opposed to boilerplate and other gibberish) in addition to extensive deduplication. You can find the code that has been used to build this dataset in URL by Tensorflow Datasets.
The C4 dataset was explicitly designed to be English only: any page that was not given a probability of at least 99% of being English by langdetect was discarded.
To build mC4, the authors used CLD3 to identify over 100 languages.
### Licensing Information
We are releasing this dataset under the terms of ODC-BY. By using this, you are also bound by the Common Crawl terms of use in respect of the content contained in the dataset.
### Acknowledgements
Big ups to the good folks at Common Crawl whose data made this possible (consider donating!), to Google for creating the code that curates and filters the data, and to Huggingface, who had no issue with hosting these 3TB of data for public download!
Thanks Allen AI for sharing the text that was processed to make this dataset.
| [
"### Source Data",
"#### Initial Data Collection and Normalization\n\n\nThe C4 and mC4 datasets are collections text sourced from the public Common Crawl web scrape. It includes heuristics to extract only natural language (as opposed to boilerplate and other gibberish) in addition to extensive deduplication. You can find the code that has been used to build this dataset in URL by Tensorflow Datasets.\n\n\nC4 dataset was explicitly designed to be English only: any page that was not given a probability of at least 99% of being English by langdetect was discarded.\n\n\nTo build mC4, the authors used CLD3 to identify over 100 languages.",
"### Licensing Information\n\n\nWe are releasing this dataset under the terms of ODC-BY. By using this, you are also bound by the Common Crawl terms of use in respect of the content contained in the dataset.",
"### Acknowledgements\n\n\nBig ups to the good folks at Common Crawl whose data made this possible (consider donating!), to Google for creating the code that curates and filters the data, and to Huggingface, who had no issue with hosting these 3TB of data for public download!\n\n\nThanks Allen AI for sharing the text that was processed to make this dataset."
] | [
"TAGS\n#task_categories-text-generation #task_categories-fill-mask #task_ids-language-modeling #task_ids-masked-language-modeling #annotations_creators-no-annotation #language_creators-found #size_categories-n<1K #size_categories-1K<n<10K #size_categories-10K<n<100K #size_categories-100K<n<1M #size_categories-1M<n<10M #size_categories-10M<n<100M #size_categories-100M<n<1B #size_categories-1B<n<10B #source_datasets-original #language-English #license-odc-by #region-us \n",
"### Source Data",
"#### Initial Data Collection and Normalization\n\n\nThe C4 and mC4 datasets are collections text sourced from the public Common Crawl web scrape. It includes heuristics to extract only natural language (as opposed to boilerplate and other gibberish) in addition to extensive deduplication. You can find the code that has been used to build this dataset in URL by Tensorflow Datasets.\n\n\nC4 dataset was explicitly designed to be English only: any page that was not given a probability of at least 99% of being English by langdetect was discarded.\n\n\nTo build mC4, the authors used CLD3 to identify over 100 languages.",
"### Licensing Information\n\n\nWe are releasing this dataset under the terms of ODC-BY. By using this, you are also bound by the Common Crawl terms of use in respect of the content contained in the dataset.",
"### Acknowledgements\n\n\nBig ups to the good folks at Common Crawl whose data made this possible (consider donating!), to Google for creating the code that curates and filters the data, and to Huggingface, who had no issue with hosting these 3TB of data for public download!\n\n\nThanks Allen AI for sharing the text that was processed to make this dataset."
] | [
186,
4,
152,
52,
80
] | [
"passage: TAGS\n#task_categories-text-generation #task_categories-fill-mask #task_ids-language-modeling #task_ids-masked-language-modeling #annotations_creators-no-annotation #language_creators-found #size_categories-n<1K #size_categories-1K<n<10K #size_categories-10K<n<100K #size_categories-100K<n<1M #size_categories-1M<n<10M #size_categories-10M<n<100M #size_categories-100M<n<1B #size_categories-1B<n<10B #source_datasets-original #language-English #license-odc-by #region-us \n### Source Data#### Initial Data Collection and Normalization\n\n\nThe C4 and mC4 datasets are collections text sourced from the public Common Crawl web scrape. It includes heuristics to extract only natural language (as opposed to boilerplate and other gibberish) in addition to extensive deduplication. You can find the code that has been used to build this dataset in URL by Tensorflow Datasets.\n\n\nC4 dataset was explicitly designed to be English only: any page that was not given a probability of at least 99% of being English by langdetect was discarded.\n\n\nTo build mC4, the authors used CLD3 to identify over 100 languages.### Licensing Information\n\n\nWe are releasing this dataset under the terms of ODC-BY. By using this, you are also bound by the Common Crawl terms of use in respect of the content contained in the dataset.### Acknowledgements\n\n\nBig ups to the good folks at Common Crawl whose data made this possible (consider donating!), to Google for creating the code that curates and filters the data, and to Huggingface, who had no issue with hosting these 3TB of data for public download!\n\n\nThanks Allen AI for sharing the text that was processed to make this dataset."
] |
230caebf06ac160c4f78c997baa4b33fabde42f0 |
# Dataset Card for Evaluation run of nxn1231/yi6
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nxn1231/yi6](https://huggingface.co/nxn1231/yi6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nxn1231__yi6",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-14T09:04:41.183373](https://huggingface.co/datasets/open-llm-leaderboard/details_nxn1231__yi6/blob/main/results_2024-02-14T09-04-41.183373.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5313784031327082,
"acc_stderr": 0.03373047704739948,
"acc_norm": 0.5418339285873789,
"acc_norm_stderr": 0.034546398177107925,
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487293,
"mc2": 0.35799585364198633,
"mc2_stderr": 0.013558704887914389
},
"harness|arc:challenge|25": {
"acc": 0.43686006825938567,
"acc_stderr": 0.014494421584256525,
"acc_norm": 0.4778156996587031,
"acc_norm_stderr": 0.014597001927076133
},
"harness|hellaswag|10": {
"acc": 0.4789882493527186,
"acc_stderr": 0.0049853735507750995,
"acc_norm": 0.6825333598884684,
"acc_norm_stderr": 0.004645393477680678
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955785,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793275,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793275
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.03268335899936336,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.03268335899936336
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806231,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806231
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.02437319786798306,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.02437319786798306
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.027273890594300642,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.027273890594300642
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.03452453903822039,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.03452453903822039
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5515151515151515,
"acc_stderr": 0.03883565977956929,
"acc_norm": 0.5515151515151515,
"acc_norm_stderr": 0.03883565977956929
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860688,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860688
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5282051282051282,
"acc_stderr": 0.025310639254933875,
"acc_norm": 0.5282051282051282,
"acc_norm_stderr": 0.025310639254933875
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.025040443877000686,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.025040443877000686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7357798165137615,
"acc_stderr": 0.018904164171510175,
"acc_norm": 0.7357798165137615,
"acc_norm_stderr": 0.018904164171510175
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.03465868196380762,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.03465868196380762
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6877637130801688,
"acc_stderr": 0.030165137867847004,
"acc_norm": 0.6877637130801688,
"acc_norm_stderr": 0.030165137867847004
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.03814269893261837,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.03814269893261837
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.0445325483632647,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.0445325483632647
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.025372139671722926,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.025372139671722926
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6615581098339719,
"acc_stderr": 0.016920869586210665,
"acc_norm": 0.6615581098339719,
"acc_norm_stderr": 0.016920869586210665
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.026152198619726792,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.026152198619726792
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25027932960893856,
"acc_stderr": 0.014487500852850417,
"acc_norm": 0.25027932960893856,
"acc_norm_stderr": 0.014487500852850417
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.02787074527829027,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.02787074527829027
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.027417996705630995,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.027417996705630995
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.02764847787741332,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.02764847787741332
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291477,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291477
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4152542372881356,
"acc_stderr": 0.012585471793400664,
"acc_norm": 0.4152542372881356,
"acc_norm_stderr": 0.012585471793400664
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3897058823529412,
"acc_stderr": 0.029624663581159703,
"acc_norm": 0.3897058823529412,
"acc_norm_stderr": 0.029624663581159703
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5326797385620915,
"acc_stderr": 0.0201845833591022,
"acc_norm": 0.5326797385620915,
"acc_norm_stderr": 0.0201845833591022
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024978,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024978
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.030965903123573037,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.030965903123573037
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6549707602339181,
"acc_stderr": 0.03645981377388806,
"acc_norm": 0.6549707602339181,
"acc_norm_stderr": 0.03645981377388806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487293,
"mc2": 0.35799585364198633,
"mc2_stderr": 0.013558704887914389
},
"harness|winogrande|5": {
"acc": 0.6464088397790055,
"acc_stderr": 0.013436541262599952
},
"harness|gsm8k|5": {
"acc": 0.04397270659590599,
"acc_stderr": 0.005647666449126459
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_nxn1231__yi6 | [
"region:us"
] | 2024-02-14T09:06:51+00:00 | {"pretty_name": "Evaluation run of nxn1231/yi6", "dataset_summary": "Dataset automatically created during the evaluation run of model [nxn1231/yi6](https://huggingface.co/nxn1231/yi6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nxn1231__yi6\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T09:04:41.183373](https://huggingface.co/datasets/open-llm-leaderboard/details_nxn1231__yi6/blob/main/results_2024-02-14T09-04-41.183373.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5313784031327082,\n \"acc_stderr\": 0.03373047704739948,\n \"acc_norm\": 0.5418339285873789,\n \"acc_norm_stderr\": 0.034546398177107925,\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.014509045171487293,\n \"mc2\": 0.35799585364198633,\n \"mc2_stderr\": 0.013558704887914389\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.43686006825938567,\n \"acc_stderr\": 0.014494421584256525,\n \"acc_norm\": 0.4778156996587031,\n \"acc_norm_stderr\": 0.014597001927076133\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4789882493527186,\n \"acc_stderr\": 0.0049853735507750995,\n \"acc_norm\": 0.6825333598884684,\n \"acc_norm_stderr\": 0.004645393477680678\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981749,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981749\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 
0.04878317312145633\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.03268335899936336,\n \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.03268335899936336\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806231,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806231\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3386243386243386,\n \"acc_stderr\": 0.02437319786798306,\n \"acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.02437319786798306\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n \"acc_stderr\": 0.027273890594300642,\n \"acc_norm\": 0.6419354838709678,\n \"acc_norm_stderr\": 0.027273890594300642\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.03452453903822039,\n \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.03452453903822039\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.03883565977956929,\n \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.03883565977956929\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\": 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860688,\n \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860688\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5282051282051282,\n \"acc_stderr\": 0.025310639254933875,\n \"acc_norm\": 
0.5282051282051282,\n \"acc_norm_stderr\": 0.025310639254933875\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.21481481481481482,\n \"acc_stderr\": 0.025040443877000686,\n \"acc_norm\": 0.21481481481481482,\n \"acc_norm_stderr\": 0.025040443877000686\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7357798165137615,\n \"acc_stderr\": 0.018904164171510175,\n \"acc_norm\": 0.7357798165137615,\n \"acc_norm_stderr\": 0.018904164171510175\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.03465868196380762,\n \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.03465868196380762\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6877637130801688,\n \"acc_stderr\": 0.030165137867847004,\n \"acc_norm\": 0.6877637130801688,\n \"acc_norm_stderr\": 0.030165137867847004\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.03814269893261837,\n \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.03814269893261837\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.0445325483632647,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.0445325483632647\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n \"acc_stderr\": 0.025372139671722926,\n \"acc_norm\": 0.8162393162393162,\n \"acc_norm_stderr\": 0.025372139671722926\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6615581098339719,\n \"acc_stderr\": 0.016920869586210665,\n \"acc_norm\": 0.6615581098339719,\n \"acc_norm_stderr\": 0.016920869586210665\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.026152198619726792,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.026152198619726792\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n \"acc_stderr\": 0.014487500852850417,\n \"acc_norm\": 0.25027932960893856,\n \"acc_norm_stderr\": 0.014487500852850417\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.02787074527829027,\n \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.02787074527829027\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n \"acc_stderr\": 0.027417996705630995,\n \"acc_norm\": 0.6302250803858521,\n \"acc_norm_stderr\": 0.027417996705630995\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.02764847787741332,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.02764847787741332\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291477,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291477\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4152542372881356,\n \"acc_stderr\": 0.012585471793400664,\n \"acc_norm\": 0.4152542372881356,\n \"acc_norm_stderr\": 0.012585471793400664\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3897058823529412,\n \"acc_stderr\": 0.029624663581159703,\n \"acc_norm\": 0.3897058823529412,\n \"acc_norm_stderr\": 0.029624663581159703\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5326797385620915,\n \"acc_stderr\": 0.0201845833591022,\n \"acc_norm\": 0.5326797385620915,\n \"acc_norm_stderr\": 0.0201845833591022\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024978,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024978\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n \"acc_stderr\": 0.030965903123573037,\n \"acc_norm\": 0.7412935323383084,\n \"acc_norm_stderr\": 0.030965903123573037\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6549707602339181,\n \"acc_stderr\": 0.03645981377388806,\n \"acc_norm\": 0.6549707602339181,\n \"acc_norm_stderr\": 0.03645981377388806\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.014509045171487293,\n \"mc2\": 0.35799585364198633,\n \"mc2_stderr\": 0.013558704887914389\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6464088397790055,\n \"acc_stderr\": 0.013436541262599952\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04397270659590599,\n \"acc_stderr\": 0.005647666449126459\n }\n}\n```", "repo_url": "https://huggingface.co/nxn1231/yi6", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|arc:challenge|25_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|gsm8k|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hellaswag|10_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T09-04-41.183373.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T09-04-41.183373.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T09-04-41.183373.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T09-04-41.183373.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T09-04-41.183373.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T09-04-41.183373.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["**/details_harness|winogrande|5_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T09-04-41.183373.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T09_04_41.183373", "path": ["results_2024-02-14T09-04-41.183373.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T09-04-41.183373.parquet"]}]}]} | 2024-02-14T09:07:13+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of nxn1231/yi6
Dataset automatically created during the evaluation run of model nxn1231/yi6 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
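For example (a minimal sketch: the repository name below follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming convention and is an assumption, since the exact link was stripped from this card):

```python
from datasets import load_dataset

# Assumed repository name, following the leaderboard's naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_nxn1231__yi6",
    "harness_winogrande_5",  # one of the 63 task configurations
    split="train",           # "train" always points to the latest results
)
```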
## Latest results
These are the latest results from run 2024-02-14T09:04:41.183373 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of nxn1231/yi6\n\n\n\nDataset automatically created during the evaluation run of model nxn1231/yi6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T09:04:41.183373(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of nxn1231/yi6\n\n\n\nDataset automatically created during the evaluation run of model nxn1231/yi6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T09:04:41.183373(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
175,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of nxn1231/yi6\n\n\n\nDataset automatically created during the evaluation run of model nxn1231/yi6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T09:04:41.183373(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
8dbef55409dcbe3be865ecbac9e1039999feac59 | # Dataset Card for "synthetic_RAG_dataset_ger_de_v02"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | SebastianBodza/synthetic_RAG_dataset_ger_de_v02 | [
"language:de",
"license:cc-by-nc-4.0",
"region:us"
] | 2024-02-14T09:08:05+00:00 | {"language": ["de"], "license": "cc-by-nc-4.0", "dataset_info": {"features": [{"name": "topic", "dtype": "string"}, {"name": "questions", "dtype": "string"}, {"name": "gen_questions", "dtype": "string"}, {"name": "Imperative Form", "dtype": "string"}, {"name": "Question", "dtype": "string"}, {"name": "Search String", "dtype": "string"}, {"name": "Positive", "dtype": "string"}, {"name": "Hard Negative", "dtype": "string"}, {"name": "raw_texts", "dtype": "string"}, {"name": "index", "dtype": "int64"}], "splits": [{"name": "raw", "num_bytes": 300511330, "num_examples": 82651}, {"name": "filtered", "num_bytes": 513227802, "num_examples": 79637}], "download_size": 338673463, "dataset_size": 813739132}, "configs": [{"config_name": "default", "data_files": [{"split": "raw", "path": "data/raw-*"}, {"split": "filtered", "path": "data/filtered-*"}]}]} | 2024-02-14T10:40:39+00:00 | [] | [
"de"
] | TAGS
#language-German #license-cc-by-nc-4.0 #region-us
| # Dataset Card for "synthetic_RAG_dataset_ger_de_v02"
More Information needed | [
"# Dataset Card for \"synthetic_RAG_dataset_ger_de_v02\"\n\nMore Information needed"
] | [
"TAGS\n#language-German #license-cc-by-nc-4.0 #region-us \n",
"# Dataset Card for \"synthetic_RAG_dataset_ger_de_v02\"\n\nMore Information needed"
] | [
21,
26
] | [
"passage: TAGS\n#language-German #license-cc-by-nc-4.0 #region-us \n# Dataset Card for \"synthetic_RAG_dataset_ger_de_v02\"\n\nMore Information needed"
] |
60566cbd654c88e5ba4f07a3f08c6c6552425f4e | # Dataset Card for "ft-capstone2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | rpadilla/ft-capstone2 | [
"region:us"
] | 2024-02-14T09:27:39+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 36223, "num_examples": 12}, {"name": "test", "num_bytes": 23728, "num_examples": 7}], "download_size": 57751, "dataset_size": 59951}} | 2024-02-14T09:27:41+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ft-capstone2"
More Information needed | [
"# Dataset Card for \"ft-capstone2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ft-capstone2\"\n\nMore Information needed"
] | [
6,
15
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"ft-capstone2\"\n\nMore Information needed"
] |
88f3946ac34cbd1765896487c3c2c5021f914548 | # Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
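As a minimal sketch of loading this dataset (the config and split names are taken from this card's metadata; everything else is illustrative):

```python
from datasets import load_dataset

# Config names ("accepted", "default", "mini") and the train/validation/test
# splits come from this card's metadata; "mini" is the smallest download.
ds = load_dataset("systemk/codenet", "mini", split="train")

# Each record pairs a code submission with its judge outcome.
example = ds[0]
print(example["problem_id"], example["status"], example["code_size"])
```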
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | systemk/codenet | [
"task_categories:text-generation",
"task_categories:text-classification",
"license:cdla-permissive-2.0",
"code",
"region:us"
] | 2024-02-14T09:30:09+00:00 | {"license": "cdla-permissive-2.0", "task_categories": ["text-generation", "text-classification"], "dataset_info": [{"config_name": "accepted", "features": [{"name": "code", "dtype": "string"}, {"name": "submission_id", "dtype": "string"}, {"name": "problem_id", "dtype": "string"}, {"name": "user_id", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "language", "dtype": {"class_label": {"names": {"0": "C++", "1": "C", "2": "Java", "3": "Python", "4": "Go", "5": "Ruby", "6": "C#", "7": "OCaml", "8": "Rust", "9": "JavaScript", "10": "PHP", "11": "Scala", "12": "Other"}}}}, {"name": "original_language", "dtype": "string"}, {"name": "filename_ext", "dtype": "string"}, {"name": "status", "dtype": {"class_label": {"names": {"0": "Accepted", "1": "Compile Error", "2": "Runtime Error", "3": "Time Limit Exceeded", "4": "Memory Limit Exceeded", "5": "Wrong Answer", "6": "Other"}}}}, {"name": "cpu_time", "dtype": "int32"}, {"name": "memory", "dtype": "int32"}, {"name": "code_size", "dtype": "int32"}, {"name": "accuracy", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7182604796.348241, "num_examples": 5004663}, {"name": "validation", "num_bytes": 1886090799.288453, "num_examples": 1500962}, {"name": "test", "num_bytes": 1350931397.6121755, "num_examples": 954963}], "download_size": 4841625499, "dataset_size": 10419626993.248869}, {"config_name": "default", "features": [{"name": "code", "dtype": "string"}, {"name": "submission_id", "dtype": "string"}, {"name": "problem_id", "dtype": "string"}, {"name": "user_id", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "language", "dtype": {"class_label": {"names": {"0": "C++", "1": "C", "2": "Java", "3": "Python", "4": "Go", "5": "Ruby", "6": "C#", "7": "OCaml", "8": "Rust", "9": "JavaScript", "10": "PHP", "11": "Scala", "12": "Other"}}}}, {"name": "original_language", "dtype": "string"}, {"name": "filename_ext", "dtype": "string"}, {"name": "status", "dtype": {"class_label": {"names": {"0": "Accepted", "1": "Compile Error", "2": "Runtime Error", "3": "Time Limit Exceeded", "4": "Memory Limit Exceeded", "5": "Wrong Answer", "6": "Other"}}}}, {"name": "cpu_time", "dtype": "int32"}, {"name": "memory", "dtype": "int32"}, {"name": "code_size", "dtype": "int32"}, {"name": "accuracy", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 13719235606, "num_examples": 9559227}, {"name": "validation", "num_bytes": 3300894506, "num_examples": 2626871}, {"name": "test", "num_bytes": 2448421072, "num_examples": 1730770}], "download_size": 7476817454, "dataset_size": 19468551184}, {"config_name": "mini", "features": [{"name": "code", "dtype": "string"}, {"name": "submission_id", "dtype": "string"}, {"name": "problem_id", "dtype": "string"}, {"name": "user_id", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "language", "dtype": {"class_label": {"names": {"0": "C++", "1": "C", "2": "Java", "3": "Python", "4": "Go", "5": "Ruby", "6": "C#", "7": "OCaml", "8": "Rust", "9": "JavaScript", "10": "PHP", "11": "Scala", "12": "Other"}}}}, {"name": "original_language", "dtype": "string"}, {"name": "filename_ext", "dtype": "string"}, {"name": "status", "dtype": {"class_label": {"names": {"0": "Accepted", "1": "Compile Error", "2": "Runtime Error", "3": "Time Limit Exceeded", "4": "Memory Limit Exceeded", "5": "Wrong Answer", "6": "Other"}}}}, {"name": "cpu_time", "dtype": "int32"}, {"name": "memory", "dtype": "int32"}, {"name": "code_size", "dtype": "int32"}, {"name": 
"accuracy", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2821205, "num_examples": 5399}, {"name": "validation", "num_bytes": 1108361, "num_examples": 1200}, {"name": "test", "num_bytes": 1426005, "num_examples": 2225}], "download_size": 1913743, "dataset_size": 5355571}], "configs": [{"config_name": "accepted", "data_files": [{"split": "train", "path": "accepted/train-*"}, {"split": "validation", "path": "accepted/validation-*"}, {"split": "test", "path": "accepted/test-*"}]}, {"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}, {"config_name": "mini", "data_files": [{"split": "train", "path": "mini/train-*"}, {"split": "validation", "path": "mini/validation-*"}, {"split": "test", "path": "mini/test-*"}]}], "tags": ["code"]} | 2024-02-16T05:11:12+00:00 | [] | [] | TAGS
#task_categories-text-generation #task_categories-text-classification #license-cdla-permissive-2.0 #code #region-us
| # Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#task_categories-text-generation #task_categories-text-classification #license-cdla-permissive-2.0 #code #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
42,
34,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#task_categories-text-generation #task_categories-text-classification #license-cdla-permissive-2.0 #code #region-us \n# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
065f065a7a272054aaee3def7db4847cfbb8176a |
# Dataset Card for Evaluation run of mzio/hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mzio/hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1](https://huggingface.co/mzio/hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mzio__hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1",
"harness_winogrande_5",
split="train")
```
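The aggregated metrics live in the "results" configuration; a sketch under the same conventions (the "latest" split mirrors the per-task configurations and is an assumption here):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" points to the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_mzio__hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1",
    "results",
    split="latest",
)
```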
## Latest results
These are the [latest results from run 2024-02-14T09:28:19.032904](https://huggingface.co/datasets/open-llm-leaderboard/details_mzio__hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1/blob/main/results_2024-02-14T09-28-19.032904.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2505139408796575,
"acc_stderr": 0.030454510096603406,
"acc_norm": 0.25147973635717114,
"acc_norm_stderr": 0.03126344512636081,
"mc1": 0.26560587515299877,
"mc1_stderr": 0.01546102762725359,
"mc2": 0.4666111070317817,
"mc2_stderr": 0.015662318573580764
},
"harness|arc:challenge|25": {
"acc": 0.17747440273037543,
"acc_stderr": 0.011165138769643956,
"acc_norm": 0.21245733788395904,
"acc_norm_stderr": 0.011953482906582949
},
"harness|hellaswag|10": {
"acc": 0.27076279625572597,
"acc_stderr": 0.0044344567170975825,
"acc_norm": 0.28739294961163114,
"acc_norm_stderr": 0.004516215206715342
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.1907514450867052,
"acc_stderr": 0.029957851329869337,
"acc_norm": 0.1907514450867052,
"acc_norm_stderr": 0.029957851329869337
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.02989614568209546,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.02989614568209546
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924812,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924812
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.03455071019102146,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.03455071019102146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.19393939393939394,
"acc_stderr": 0.030874145136562097,
"acc_norm": 0.19393939393939394,
"acc_norm_stderr": 0.030874145136562097
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2828282828282828,
"acc_stderr": 0.032087795587867514,
"acc_norm": 0.2828282828282828,
"acc_norm_stderr": 0.032087795587867514
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178256,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178256
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2153846153846154,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.2153846153846154,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02671924078371216,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02671924078371216
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3277310924369748,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.3277310924369748,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22018348623853212,
"acc_stderr": 0.017765978652327565,
"acc_norm": 0.22018348623853212,
"acc_norm_stderr": 0.017765978652327565
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501954,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501954
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.19831223628691982,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.19831223628691982,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.242152466367713,
"acc_stderr": 0.028751392398694755,
"acc_norm": 0.242152466367713,
"acc_norm_stderr": 0.028751392398694755
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.19083969465648856,
"acc_stderr": 0.03446513350752597,
"acc_norm": 0.19083969465648856,
"acc_norm_stderr": 0.03446513350752597
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755806,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755806
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18376068376068377,
"acc_stderr": 0.025372139671722933,
"acc_norm": 0.18376068376068377,
"acc_norm_stderr": 0.025372139671722933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2707535121328225,
"acc_stderr": 0.015889888362560486,
"acc_norm": 0.2707535121328225,
"acc_norm_stderr": 0.015889888362560486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2514450867052023,
"acc_stderr": 0.023357365785874027,
"acc_norm": 0.2514450867052023,
"acc_norm_stderr": 0.023357365785874027
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.28938906752411575,
"acc_stderr": 0.025755865922632924,
"acc_norm": 0.28938906752411575,
"acc_norm_stderr": 0.025755865922632924
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.02456922360046085,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.02456922360046085
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22340425531914893,
"acc_stderr": 0.024847921358063962,
"acc_norm": 0.22340425531914893,
"acc_norm_stderr": 0.024847921358063962
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676655,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676655
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329376,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329376
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2693877551020408,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.2693877551020408,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348398,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348398
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.25146198830409355,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.25146198830409355,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26560587515299877,
"mc1_stderr": 0.01546102762725359,
"mc2": 0.4666111070317817,
"mc2_stderr": 0.015662318573580764
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076884
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_mzio__hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1 | [
"region:us"
] | 2024-02-14T09:30:43+00:00 | {"pretty_name": "Evaluation run of mzio/hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1", "dataset_summary": "Dataset automatically created during the evaluation run of model [mzio/hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1](https://huggingface.co/mzio/hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mzio__hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T09:28:19.032904](https://huggingface.co/datasets/open-llm-leaderboard/details_mzio__hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1/blob/main/results_2024-02-14T09-28-19.032904.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2505139408796575,\n \"acc_stderr\": 0.030454510096603406,\n \"acc_norm\": 0.25147973635717114,\n \"acc_norm_stderr\": 0.03126344512636081,\n \"mc1\": 0.26560587515299877,\n \"mc1_stderr\": 0.01546102762725359,\n \"mc2\": 0.4666111070317817,\n \"mc2_stderr\": 0.015662318573580764\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.17747440273037543,\n \"acc_stderr\": 0.011165138769643956,\n \"acc_norm\": 0.21245733788395904,\n \"acc_norm_stderr\": 0.011953482906582949\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27076279625572597,\n \"acc_stderr\": 0.0044344567170975825,\n \"acc_norm\": 0.28739294961163114,\n \"acc_norm_stderr\": 0.004516215206715342\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510861,\n \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510861\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.2152777777777778,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.1907514450867052,\n \"acc_stderr\": 0.029957851329869337,\n \"acc_norm\": 0.1907514450867052,\n \"acc_norm_stderr\": 0.029957851329869337\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.02989614568209546,\n \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.02989614568209546\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 
0.2206896551724138,\n \"acc_stderr\": 0.03455930201924812,\n \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924812\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n \"acc_stderr\": 0.03455071019102146,\n \"acc_norm\": 0.18253968253968253,\n \"acc_norm_stderr\": 0.03455071019102146\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.19393939393939394,\n \"acc_stderr\": 0.030874145136562097,\n \"acc_norm\": 0.19393939393939394,\n \"acc_norm_stderr\": 0.030874145136562097\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2828282828282828,\n \"acc_stderr\": 0.032087795587867514,\n \"acc_norm\": 0.2828282828282828,\n \"acc_norm_stderr\": 0.032087795587867514\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178256,\n \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178256\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2153846153846154,\n \"acc_stderr\": 0.020843034557462878,\n \"acc_norm\": 0.2153846153846154,\n \"acc_norm_stderr\": 0.020843034557462878\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02671924078371216,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02671924078371216\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3277310924369748,\n \"acc_stderr\": 0.030489911417673227,\n \"acc_norm\": 0.3277310924369748,\n \"acc_norm_stderr\": 0.030489911417673227\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22018348623853212,\n \"acc_stderr\": 0.017765978652327565,\n \"acc_norm\": 0.22018348623853212,\n \"acc_norm_stderr\": 0.017765978652327565\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501954,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501954\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n 
\"acc\": 0.19831223628691982,\n \"acc_stderr\": 0.025955020841621115,\n \"acc_norm\": 0.19831223628691982,\n \"acc_norm_stderr\": 0.025955020841621115\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.242152466367713,\n \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.242152466367713,\n \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.19083969465648856,\n \"acc_stderr\": 0.03446513350752597,\n \"acc_norm\": 0.19083969465648856,\n \"acc_norm_stderr\": 0.03446513350752597\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\": 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.04007341809755806,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.04007341809755806\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18376068376068377,\n \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.18376068376068377,\n \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.2707535121328225,\n \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.023357365785874027,\n \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.023357365785874027\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.28938906752411575,\n \"acc_stderr\": 0.025755865922632924,\n \"acc_norm\": 0.28938906752411575,\n \"acc_norm_stderr\": 0.025755865922632924\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.02456922360046085,\n \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.02456922360046085\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.22340425531914893,\n \"acc_stderr\": 0.024847921358063962,\n \"acc_norm\": 0.22340425531914893,\n \"acc_norm_stderr\": 0.024847921358063962\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n \"acc_stderr\": 0.010896123652676655,\n \"acc_norm\": 0.2392438070404172,\n 
\"acc_norm_stderr\": 0.010896123652676655\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329376,\n \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329376\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2693877551020408,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.2693877551020408,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n \"acc_stderr\": 0.029929415408348398,\n \"acc_norm\": 0.23383084577114427,\n \"acc_norm_stderr\": 0.029929415408348398\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n \"acc_stderr\": 0.03141784291663926,\n \"acc_norm\": 0.20481927710843373,\n \"acc_norm_stderr\": 0.03141784291663926\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.25146198830409355,\n \"acc_stderr\": 0.033275044238468436,\n \"acc_norm\": 0.25146198830409355,\n \"acc_norm_stderr\": 0.033275044238468436\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26560587515299877,\n \"mc1_stderr\": 0.01546102762725359,\n \"mc2\": 0.4666111070317817,\n \"mc2_stderr\": 0.015662318573580764\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076884\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/mzio/hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|arc:challenge|25_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|gsm8k|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hellaswag|10_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T09-28-19.032904.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T09-28-19.032904.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T09-28-19.032904.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T09-28-19.032904.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": 
["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T09-28-19.032904.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": 
["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["**/details_harness|winogrande|5_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T09-28-19.032904.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T09_28_19.032904", "path": ["results_2024-02-14T09-28-19.032904.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T09-28-19.032904.parquet"]}]}]} | 2024-02-14T09:31:08+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of mzio/hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1
Dataset automatically created during the evaluation run of model mzio/hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
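A minimal sketch with the Hugging Face `datasets` library (the repository id and the `harness_winogrande_5` config name are taken from this card's metadata; any other config listed there loads the same way):

```python
from datasets import load_dataset

# Load the Winogrande details split for this evaluation run; swap the
# second argument for any other config name declared in the metadata.
data = load_dataset(
    "open-llm-leaderboard/details_mzio__hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1",
    "harness_winogrande_5",
    split="train",
)
```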
## Latest results
These are the latest results from run 2024-02-14T09:28:19.032904 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of mzio/hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1\n\n\n\nDataset automatically created during the evaluation run of model mzio/hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T09:28:19.032904(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of mzio/hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1\n\n\n\nDataset automatically created during the evaluation run of model mzio/hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T09:28:19.032904(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
249,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mzio/hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1\n\n\n\nDataset automatically created during the evaluation run of model mzio/hedgehog-alpaca_clean_mistral-mistral_7b_lk_esn_tqk_lora-lk_untied_head-lsc_1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T09:28:19.032904(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations"
] |
cf8b0efa806cec6d9fdd53a72ab585224e6e5c9d |
This dataset consists of judging the answers from GPT4 and GPT4 Turbo.
It is the judge version of [yleo/emerton_dpo_pairs](https://huggingface.co/datasets/yleo/emerton_dpo_pairs).
To perform the judging, [llm-blender/PairRM](https://huggingface.co/llm-blender/PairRM) is used.
I recommend filtering on chosen_judge_score > 1 to keep only significant gaps.
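For example, a minimal filtering sketch with the `datasets` library (assuming the default `train` split and the `chosen_judge_score` column listed in the dataset metadata):

```python
from datasets import load_dataset

# Load the judged preference pairs (the dataset has a single "train" split).
ds = load_dataset("yleo/emerton_dpo_pairs_judge", split="train")

# Keep only the pairs where the judge found a significant gap
# between the chosen and the rejected answer.
significant = ds.filter(lambda ex: ex["chosen_judge_score"] > 1)
print(f"kept {len(significant)} of {len(ds)} pairs")
```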
| yleo/emerton_dpo_pairs_judge | [
"region:us"
] | 2024-02-14T09:42:13+00:00 | {"dataset_info": {"features": [{"name": "system", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}, {"name": "generations", "sequence": "string"}, {"name": "generation_model", "sequence": "string"}, {"name": "rating", "sequence": "float32"}, {"name": "chosen_judge", "dtype": "string"}, {"name": "rejected_judge", "dtype": "string"}, {"name": "chosen_judge_model", "dtype": "string"}, {"name": "rejected_judge_model", "dtype": "string"}, {"name": "rejected_judge_score", "dtype": "float64"}, {"name": "chosen_judge_score", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 38173225, "num_examples": 5489}], "download_size": 21529431, "dataset_size": 38173225}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-14T12:17:28+00:00 | [] | [] | TAGS
#region-us
|
This dataset consists of judging the answers from GPT4 and GPT4 Turbo.
It is the judge version of yleo/emerton_dpo_pairs.
To perform the judging, llm-blender/PairRM is used.
I recommend filtering on chosen_judge_score > 1 to keep only significant gaps.
| [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
90e17c3ccb433b705dc0c027d1e50899b410eca0 | # Dataset Card for "ft-capstone3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | rpadilla/ft-capstone3 | [
"region:us"
] | 2024-02-14T09:48:19+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 34211, "num_examples": 12}, {"name": "test", "num_bytes": 20456, "num_examples": 7}], "download_size": 58171, "dataset_size": 54667}} | 2024-02-14T09:48:22+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ft-capstone3"
More Information needed | [
"# Dataset Card for \"ft-capstone3\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ft-capstone3\"\n\nMore Information needed"
] | [
6,
15
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"ft-capstone3\"\n\nMore Information needed"
] |
f2f2d1f58ab07a7e1889d3401fa6148ef68887eb |
# Dataset Card for Evaluation run of mzio/hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mzio/hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3](https://huggingface.co/mzio/hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
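# Load the per-sample details for one task configuration (here, 5-shot
# Winogrande); the "train" split always points to the latest run.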
data = load_dataset("open-llm-leaderboard/details_mzio__hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3",
"harness_winogrande_5",
split="train")
```
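The full list of the 63 task configurations can also be discovered programmatically; a short sketch (assuming network access to the Hugging Face Hub):

```python
from datasets import get_dataset_config_names

# List every available configuration of this details dataset
# (one per evaluated task).
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_mzio__hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3"
)
print(len(configs), "configurations, e.g.:", configs[:5])
```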
## Latest results
These are the [latest results from run 2024-02-14T09:47:11.796645](https://huggingface.co/datasets/open-llm-leaderboard/details_mzio__hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3/blob/main/results_2024-02-14T09-47-11.796645.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23546482341573058,
"acc_stderr": 0.030106034473533504,
"acc_norm": 0.23527256450662293,
"acc_norm_stderr": 0.0308992047184514,
"mc1": 0.24479804161566707,
"mc1_stderr": 0.015051869486714999,
"mc2": 0.5065169026299828,
"mc2_stderr": 0.016573941311572672
},
"harness|arc:challenge|25": {
"acc": 0.19965870307167236,
"acc_stderr": 0.011681625756888667,
"acc_norm": 0.23293515358361774,
"acc_norm_stderr": 0.0123525070426174
},
"harness|hellaswag|10": {
"acc": 0.261202947619996,
"acc_stderr": 0.004383925147478736,
"acc_norm": 0.25473013343955386,
"acc_norm_stderr": 0.004348189459336531
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677088,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677088
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.025604233470899098,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.025604233470899098
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566018,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566018
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307809,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307809
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23829787234042554,
"acc_stderr": 0.02785125297388977,
"acc_norm": 0.23829787234042554,
"acc_norm_stderr": 0.02785125297388977
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481404,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481404
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2275132275132275,
"acc_stderr": 0.021591269407823792,
"acc_norm": 0.2275132275132275,
"acc_norm_stderr": 0.021591269407823792
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.19032258064516128,
"acc_stderr": 0.022331707611823085,
"acc_norm": 0.19032258064516128,
"acc_norm_stderr": 0.022331707611823085
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2019704433497537,
"acc_stderr": 0.02824735012218027,
"acc_norm": 0.2019704433497537,
"acc_norm_stderr": 0.02824735012218027
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.20606060606060606,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.20606060606060606,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18686868686868688,
"acc_stderr": 0.027772533334218977,
"acc_norm": 0.18686868686868688,
"acc_norm_stderr": 0.027772533334218977
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128006,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128006
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.025497532639609542,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.025497532639609542
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.027205371538279472,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.027205371538279472
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.03336767086567976,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.03336767086567976
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.01690927688493609,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.01690927688493609
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16203703703703703,
"acc_stderr": 0.025130453652268455,
"acc_norm": 0.16203703703703703,
"acc_norm_stderr": 0.025130453652268455
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.028304657943035275,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.028304657943035275
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.336322869955157,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.336322869955157,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.039166677628225864,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.039166677628225864
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.029872577708891148,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.029872577708891148
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23371647509578544,
"acc_stderr": 0.015133383278988836,
"acc_norm": 0.23371647509578544,
"acc_norm_stderr": 0.015133383278988836
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.02344582627654555,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.02344582627654555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808836,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808836
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023805186524888146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023805186524888146
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.18006430868167203,
"acc_stderr": 0.02182342285774495,
"acc_norm": 0.18006430868167203,
"acc_norm_stderr": 0.02182342285774495
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023132376234543332,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023132376234543332
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22695035460992907,
"acc_stderr": 0.024987106365642973,
"acc_norm": 0.22695035460992907,
"acc_norm_stderr": 0.024987106365642973
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25488917861799215,
"acc_stderr": 0.011130509812662979,
"acc_norm": 0.25488917861799215,
"acc_norm_stderr": 0.011130509812662979
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1948529411764706,
"acc_stderr": 0.02406059942348743,
"acc_norm": 0.1948529411764706,
"acc_norm_stderr": 0.02406059942348743
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.01755581809132226,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.01755581809132226
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.025000256039546212,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.025000256039546212
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772432,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772432
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.26506024096385544,
"acc_stderr": 0.03436024037944967,
"acc_norm": 0.26506024096385544,
"acc_norm_stderr": 0.03436024037944967
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30409356725146197,
"acc_stderr": 0.03528211258245233,
"acc_norm": 0.30409356725146197,
"acc_norm_stderr": 0.03528211258245233
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24479804161566707,
"mc1_stderr": 0.015051869486714999,
"mc2": 0.5065169026299828,
"mc2_stderr": 0.016573941311572672
},
"harness|winogrande|5": {
"acc": 0.5090765588003157,
"acc_stderr": 0.014050170094497704
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
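The per-task scores above can be aggregated further; for instance, a minimal sketch (assuming the results file referenced above has been downloaded and that its `"results"` key holds the dictionary printed here) that averages accuracy over the MMLU (hendrycksTest) subtasks:

```python
import json

# Assumption: the downloaded results file wraps the dictionary shown
# above under a top-level "results" key.
with open("results_2024-02-14T09-47-11.796645.json") as f:
    results = json.load(f)["results"]

# Average accuracy over the MMLU (hendrycksTest) subtasks.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
print(f"MMLU average over {len(mmlu_accs)} subtasks: "
      f"{sum(mmlu_accs) / len(mmlu_accs):.4f}")
```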
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_mzio__hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3 | [
"region:us"
] | 2024-02-14T09:49:28+00:00 | {"pretty_name": "Evaluation run of mzio/hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3", "dataset_summary": "Dataset automatically created during the evaluation run of model [mzio/hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3](https://huggingface.co/mzio/hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mzio__hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T09:47:11.796645](https://huggingface.co/datasets/open-llm-leaderboard/details_mzio__hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3/blob/main/results_2024-02-14T09-47-11.796645.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23546482341573058,\n \"acc_stderr\": 0.030106034473533504,\n \"acc_norm\": 0.23527256450662293,\n \"acc_norm_stderr\": 0.0308992047184514,\n \"mc1\": 0.24479804161566707,\n \"mc1_stderr\": 0.015051869486714999,\n \"mc2\": 0.5065169026299828,\n \"mc2_stderr\": 0.016573941311572672\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.19965870307167236,\n \"acc_stderr\": 0.011681625756888667,\n \"acc_norm\": 0.23293515358361774,\n \"acc_norm_stderr\": 0.0123525070426174\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.261202947619996,\n \"acc_stderr\": 0.004383925147478736,\n \"acc_norm\": 0.25473013343955386,\n \"acc_norm_stderr\": 0.004348189459336531\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677088,\n \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677088\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.025604233470899098,\n \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.025604233470899098\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n \"acc_stderr\": 0.03716177437566018,\n \"acc_norm\": 0.2708333333333333,\n 
\"acc_norm_stderr\": 0.03716177437566018\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307809,\n \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307809\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.23829787234042554,\n \"acc_stderr\": 0.02785125297388977,\n \"acc_norm\": 0.23829787234042554,\n \"acc_norm_stderr\": 0.02785125297388977\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.040493392977481404,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.040493392977481404\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2275132275132275,\n \"acc_stderr\": 0.021591269407823792,\n \"acc_norm\": 0.2275132275132275,\n \"acc_norm_stderr\": 0.021591269407823792\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.19032258064516128,\n \"acc_stderr\": 0.022331707611823085,\n \"acc_norm\": 0.19032258064516128,\n \"acc_norm_stderr\": 0.022331707611823085\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2019704433497537,\n \"acc_stderr\": 0.02824735012218027,\n \"acc_norm\": 0.2019704433497537,\n \"acc_norm_stderr\": 0.02824735012218027\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.20606060606060606,\n \"acc_stderr\": 0.03158415324047709,\n \"acc_norm\": 0.20606060606060606,\n \"acc_norm_stderr\": 0.03158415324047709\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.18686868686868688,\n \"acc_stderr\": 0.027772533334218977,\n \"acc_norm\": 0.18686868686868688,\n \"acc_norm_stderr\": 0.027772533334218977\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 
0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128006,\n \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128006\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.22592592592592592,\n \"acc_stderr\": 0.025497532639609542,\n \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.025497532639609542\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279472,\n \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279472\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2119205298013245,\n \"acc_stderr\": 0.03336767086567976,\n \"acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.03336767086567976\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.01690927688493609,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.01690927688493609\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.16203703703703703,\n \"acc_stderr\": 0.025130453652268455,\n \"acc_norm\": 0.16203703703703703,\n \"acc_norm_stderr\": 0.025130453652268455\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035275,\n \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035275\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.336322869955157,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.039166677628225864,\n \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.039166677628225864\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n \"acc_stderr\": 0.029872577708891148,\n \"acc_norm\": 0.2948717948717949,\n \"acc_norm_stderr\": 0.029872577708891148\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 
0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23371647509578544,\n \"acc_stderr\": 0.015133383278988836,\n \"acc_norm\": 0.23371647509578544,\n \"acc_norm_stderr\": 0.015133383278988836\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.02344582627654555,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.02344582627654555\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808836,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808836\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023805186524888146,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023805186524888146\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.18006430868167203,\n \"acc_stderr\": 0.02182342285774495,\n \"acc_norm\": 0.18006430868167203,\n \"acc_norm_stderr\": 0.02182342285774495\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023132376234543332,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023132376234543332\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.22695035460992907,\n \"acc_stderr\": 0.024987106365642973,\n \"acc_norm\": 0.22695035460992907,\n \"acc_norm_stderr\": 0.024987106365642973\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25488917861799215,\n \"acc_stderr\": 0.011130509812662979,\n \"acc_norm\": 0.25488917861799215,\n \"acc_norm_stderr\": 0.011130509812662979\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.1948529411764706,\n \"acc_stderr\": 0.02406059942348743,\n \"acc_norm\": 0.1948529411764706,\n \"acc_norm_stderr\": 0.02406059942348743\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.01755581809132226,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.01755581809132226\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.025000256039546212,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.025000256039546212\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n \"acc_stderr\": 0.029705284056772432,\n \"acc_norm\": 0.22885572139303484,\n \"acc_norm_stderr\": 0.029705284056772432\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.26506024096385544,\n \"acc_stderr\": 0.03436024037944967,\n \"acc_norm\": 0.26506024096385544,\n \"acc_norm_stderr\": 0.03436024037944967\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.30409356725146197,\n \"acc_stderr\": 0.03528211258245233,\n \"acc_norm\": 0.30409356725146197,\n \"acc_norm_stderr\": 0.03528211258245233\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24479804161566707,\n \"mc1_stderr\": 0.015051869486714999,\n \"mc2\": 0.5065169026299828,\n \"mc2_stderr\": 0.016573941311572672\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5090765588003157,\n 
\"acc_stderr\": 0.014050170094497704\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/mzio/hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|arc:challenge|25_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|gsm8k|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hellaswag|10_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T09-47-11.796645.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T09-47-11.796645.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T09-47-11.796645.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T09-47-11.796645.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T09-47-11.796645.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["**/details_harness|winogrande|5_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-14T09-47-11.796645.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T09_47_11.796645", "path": ["results_2024-02-14T09-47-11.796645.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T09-47-11.796645.parquet"]}]}]} | 2024-02-14T09:49:49+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of mzio/hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3
Dataset automatically created during the evaluation run of model mzio/hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
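A minimal sketch (the repository id below is inferred from the `details_<org>__<model>` naming pattern used by the other evaluation datasets in this collection, and is an assumption rather than something stated on this card):

```python
from datasets import load_dataset

# repo id inferred from the collection's naming pattern (an assumption,
# not confirmed by this card); the config name selects one benchmark
data = load_dataset(
    "open-llm-leaderboard/details_mzio__hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3",
    "harness_winogrande_5",
    split="train",
)
```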
## Latest results
These are the latest results from run 2024-02-14T09:47:11.796645 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of mzio/hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3\n\n\n\nDataset automatically created during the evaluation run of model mzio/hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T09:47:11.796645(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of mzio/hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3\n\n\n\nDataset automatically created during the evaluation run of model mzio/hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T09:47:11.796645(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
217,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mzio/hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3\n\n\n\nDataset automatically created during the evaluation run of model mzio/hedgehog-mistral_7b-alpaca_clean-smd_lora_1e_3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T09:47:11.796645(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:"
] |
999f9e38b3bda4d19b8648d90888a5e06f6a8c6e | # Dataset Card for "poc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | wisenut-nlp-team/poc | [
"region:us"
] | 2024-02-14T09:50:33+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "context", "sequence": "string"}, {"name": "answer", "sequence": "string"}, {"name": "original_answer", "sequence": "string"}, {"name": "similar_contexts", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 9952454110.0, "num_examples": 529113}], "download_size": 4013475497, "dataset_size": 9952454110.0}} | 2024-02-14T09:57:07+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "poc"
More Information needed | [
"# Dataset Card for \"poc\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"poc\"\n\nMore Information needed"
] | [
6,
12
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"poc\"\n\nMore Information needed"
] |
65cd779c938be4e307c96447f517e11ab619b629 | ## Auxiliary data (mask, constants, statistics...) and PyTorch checkpoints required to reproduce the performance of Pangu-Weather at a 24-hour forecasting horizon.
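A minimal sketch of pulling one of the files with the standard `huggingface_hub` workflow (the filename below is a hypothetical placeholder, not an actual path from this repo):

```python
import torch
from huggingface_hub import hf_hub_download

# "aux/land_mask.pt" is a hypothetical filename -- replace it with an
# actual path from the repository's file listing
path = hf_hub_download(
    repo_id="zhaoshan/pangu_pytorch",
    filename="aux/land_mask.pt",
    repo_type="dataset",
)
state = torch.load(path, map_location="cpu")  # load the checkpoint/constants on CPU
```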
| zhaoshan/pangu_pytorch | [
"license:apache-2.0",
"region:us"
] | 2024-02-14T10:04:15+00:00 | {"license": "apache-2.0"} | 2024-02-14T11:12:51+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
| ## Auxiliary data (mask, constants, statistics...) and PyTorch checkpoints required to reproduce the performance of Pangu-Weather at a 24-hour forecasting horizon.
| [
"## Auxiliary data (mask, constants, statistics...) and pytorch checkpoints required to reproduce performance of Pangu-Weather at 24-hour forecasting horizon."
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"## Auxiliary data (mask, constants, statistics...) and pytorch checkpoints required to reproduce performance of Pangu-Weather at 24-hour forecasting horizon."
] | [
14,
44
] | [
"passage: TAGS\n#license-apache-2.0 #region-us \n## Auxiliary data (mask, constants, statistics...) and pytorch checkpoints required to reproduce performance of Pangu-Weather at 24-hour forecasting horizon."
] |
5efcb701ee61f64d9059b7a6fe70002b8d39ea2a |
# Dataset Card for Evaluation run of yleo/EmertonOmniBeagle-7B-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yleo/EmertonOmniBeagle-7B-dpo](https://huggingface.co/yleo/EmertonOmniBeagle-7B-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
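# the "train" split always points to the results of the latest run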
data = load_dataset("open-llm-leaderboard/details_yleo__EmertonOmniBeagle-7B-dpo",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-14T10:17:01.661454](https://huggingface.co/datasets/open-llm-leaderboard/details_yleo__EmertonOmniBeagle-7B-dpo/blob/main/results_2024-02-14T10-17-01.661454.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6503063779591207,
"acc_stderr": 0.03221551316026954,
"acc_norm": 0.6499041876908436,
"acc_norm_stderr": 0.03288821519239377,
"mc1": 0.6034271725826194,
"mc1_stderr": 0.01712493094202351,
"mc2": 0.7562392596040229,
"mc2_stderr": 0.01399559383226538
},
"harness|arc:challenge|25": {
"acc": 0.6996587030716723,
"acc_stderr": 0.013395909309957009,
"acc_norm": 0.726962457337884,
"acc_norm_stderr": 0.01301933276263576
},
"harness|hellaswag|10": {
"acc": 0.7077275443138817,
"acc_stderr": 0.0045387734937465465,
"acc_norm": 0.8843855805616411,
"acc_norm_stderr": 0.0031910847927931513
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778394,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815632,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815632
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42793296089385474,
"acc_stderr": 0.01654788799741611,
"acc_norm": 0.42793296089385474,
"acc_norm_stderr": 0.01654788799741611
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140446,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140446
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460842,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.018771683893528176,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.018771683893528176
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578327,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6034271725826194,
"mc1_stderr": 0.01712493094202351,
"mc2": 0.7562392596040229,
"mc2_stderr": 0.01399559383226538
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598482
},
"harness|gsm8k|5": {
"acc": 0.6853677028051555,
"acc_stderr": 0.01279103722733603
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_yleo__EmertonOmniBeagle-7B-dpo | [
"region:us"
] | 2024-02-14T10:19:21+00:00 | {"pretty_name": "Evaluation run of yleo/EmertonOmniBeagle-7B-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [yleo/EmertonOmniBeagle-7B-dpo](https://huggingface.co/yleo/EmertonOmniBeagle-7B-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yleo__EmertonOmniBeagle-7B-dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T10:17:01.661454](https://huggingface.co/datasets/open-llm-leaderboard/details_yleo__EmertonOmniBeagle-7B-dpo/blob/main/results_2024-02-14T10-17-01.661454.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6503063779591207,\n \"acc_stderr\": 0.03221551316026954,\n \"acc_norm\": 0.6499041876908436,\n \"acc_norm_stderr\": 0.03288821519239377,\n \"mc1\": 0.6034271725826194,\n \"mc1_stderr\": 0.01712493094202351,\n \"mc2\": 0.7562392596040229,\n \"mc2_stderr\": 0.01399559383226538\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6996587030716723,\n \"acc_stderr\": 0.013395909309957009,\n \"acc_norm\": 0.726962457337884,\n \"acc_norm_stderr\": 0.01301933276263576\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7077275443138817,\n \"acc_stderr\": 0.0045387734937465465,\n \"acc_norm\": 0.8843855805616411,\n \"acc_norm_stderr\": 0.0031910847927931513\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n 
\"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n \"acc_stderr\": 0.01654788799741611,\n \"acc_norm\": 0.42793296089385474,\n \"acc_norm_stderr\": 0.01654788799741611\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.026160584450140446,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.026160584450140446\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460842,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460842\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.018771683893528176,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.018771683893528176\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578327,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6034271725826194,\n \"mc1_stderr\": 0.01712493094202351,\n \"mc2\": 0.7562392596040229,\n \"mc2_stderr\": 0.01399559383226538\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598482\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6853677028051555,\n \"acc_stderr\": 0.01279103722733603\n }\n}\n```", "repo_url": 
"https://huggingface.co/yleo/EmertonOmniBeagle-7B-dpo", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|arc:challenge|25_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|gsm8k|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hellaswag|10_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-17-01.661454.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-17-01.661454.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-17-01.661454.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T10-17-01.661454.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-17-01.661454.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-17-01.661454.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["**/details_harness|winogrande|5_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T10-17-01.661454.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T10_17_01.661454", "path": ["results_2024-02-14T10-17-01.661454.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T10-17-01.661454.parquet"]}]}]} | 2024-02-14T10:19:43+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of yleo/EmertonOmniBeagle-7B-dpo
Dataset automatically created during the evaluation run of model yleo/EmertonOmniBeagle-7B-dpo on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
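A minimal sketch of that call, following the leaderboard's usual `details_<org>__<model>` repo naming; the repo id below is inferred from the model name rather than copied from the original card:

```python
from datasets import load_dataset

# Repo id inferred from the details_<org>__<model> convention (assumption).
data = load_dataset("open-llm-leaderboard/details_yleo__EmertonOmniBeagle-7B-dpo",
	"harness_winogrande_5",
	split="train")
```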
## Latest results
These are the latest results from run 2024-02-14T10:17:01.661454 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of yleo/EmertonOmniBeagle-7B-dpo\n\n\n\nDataset automatically created during the evaluation run of model yleo/EmertonOmniBeagle-7B-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T10:17:01.661454(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of yleo/EmertonOmniBeagle-7B-dpo\n\n\n\nDataset automatically created during the evaluation run of model yleo/EmertonOmniBeagle-7B-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T10:17:01.661454(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of yleo/EmertonOmniBeagle-7B-dpo\n\n\n\nDataset automatically created during the evaluation run of model yleo/EmertonOmniBeagle-7B-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T10:17:01.661454(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
199e075a3065fdc93295d5d67a3b29ab63820112 |
# Dataset Card for Evaluation run of RaduGabriel/MUZD
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RaduGabriel/MUZD](https://huggingface.co/RaduGabriel/MUZD) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RaduGabriel__MUZD",
"harness_winogrande_5",
split="train")
```
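If you only need the aggregated scores rather than the per-sample details, the sketch below loads the `results` config at its `latest` split (assuming this repo declares the same `results`/`latest` config layout as the neighboring leaderboard detail repos):

```python
from datasets import load_dataset

# Aggregated metrics only; the "results" config and "latest" split are
# assumed to follow the layout used by the other detail repos.
results = load_dataset("open-llm-leaderboard/details_RaduGabriel__MUZD",
	"results",
	split="latest")
print(results[0])  # single row holding the aggregated scores
```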
## Latest results
These are the [latest results from run 2024-02-14T10:37:51.263631](https://huggingface.co/datasets/open-llm-leaderboard/details_RaduGabriel__MUZD/blob/main/results_2024-02-14T10-37-51.263631.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6315569902368788,
"acc_stderr": 0.03257903006074675,
"acc_norm": 0.633414948028734,
"acc_norm_stderr": 0.033238569460214515,
"mc1": 0.48959608323133413,
"mc1_stderr": 0.017499711430249264,
"mc2": 0.6572688672491508,
"mc2_stderr": 0.014888678305017567
},
"harness|arc:challenge|25": {
"acc": 0.6151877133105802,
"acc_stderr": 0.014218371065251102,
"acc_norm": 0.6680887372013652,
"acc_norm_stderr": 0.013760988200880536
},
"harness|hellaswag|10": {
"acc": 0.6719776936865166,
"acc_stderr": 0.0046853348440386595,
"acc_norm": 0.8653654650468035,
"acc_norm_stderr": 0.0034063520713417173
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952928,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337152,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337152
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424648,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424648
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6741935483870968,
"acc_stderr": 0.026662010578567104,
"acc_norm": 0.6741935483870968,
"acc_norm_stderr": 0.026662010578567104
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386424,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386424
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902796,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473082,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010323,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565438,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565438
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909476,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909476
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899133,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899133
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545546,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545546
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41675977653631285,
"acc_stderr": 0.016489134962438954,
"acc_norm": 0.41675977653631285,
"acc_norm_stderr": 0.016489134962438954
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4589308996088657,
"acc_stderr": 0.012727084826799798,
"acc_norm": 0.4589308996088657,
"acc_norm_stderr": 0.012727084826799798
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000325,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000325
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.02927956741106568,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.02927956741106568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6119402985074627,
"acc_stderr": 0.034457899643627506,
"acc_norm": 0.6119402985074627,
"acc_norm_stderr": 0.034457899643627506
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48959608323133413,
"mc1_stderr": 0.017499711430249264,
"mc2": 0.6572688672491508,
"mc2_stderr": 0.014888678305017567
},
"harness|winogrande|5": {
"acc": 0.813733228097869,
"acc_stderr": 0.010941877955676211
},
"harness|gsm8k|5": {
"acc": 0.5860500379075056,
"acc_stderr": 0.013566991960151778
}
}
```
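As a minimal sketch of working with this blob, the snippet below averages `acc_norm` over the MMLU (hendrycksTest) sub-tasks, assuming the JSON above has been saved locally as `results.json` (a hypothetical filename):

```python
import json

# Hypothetical local copy of the results JSON shown above.
with open("results.json") as f:
    scores = json.load(f)

# Mean acc_norm over the MMLU (hendrycksTest) sub-tasks.
mmlu = [v["acc_norm"] for k, v in scores.items()
        if k.startswith("harness|hendrycksTest-")]
print(f"MMLU acc_norm mean over {len(mmlu)} tasks: {sum(mmlu) / len(mmlu):.4f}")
```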
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_RaduGabriel__MUZD | [
"region:us"
] | 2024-02-14T10:40:11+00:00 | {"pretty_name": "Evaluation run of RaduGabriel/MUZD", "dataset_summary": "Dataset automatically created during the evaluation run of model [RaduGabriel/MUZD](https://huggingface.co/RaduGabriel/MUZD) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RaduGabriel__MUZD\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T10:37:51.263631](https://huggingface.co/datasets/open-llm-leaderboard/details_RaduGabriel__MUZD/blob/main/results_2024-02-14T10-37-51.263631.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6315569902368788,\n \"acc_stderr\": 0.03257903006074675,\n \"acc_norm\": 0.633414948028734,\n \"acc_norm_stderr\": 0.033238569460214515,\n \"mc1\": 0.48959608323133413,\n \"mc1_stderr\": 0.017499711430249264,\n \"mc2\": 0.6572688672491508,\n \"mc2_stderr\": 0.014888678305017567\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6151877133105802,\n \"acc_stderr\": 0.014218371065251102,\n \"acc_norm\": 0.6680887372013652,\n \"acc_norm_stderr\": 0.013760988200880536\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6719776936865166,\n \"acc_stderr\": 0.0046853348440386595,\n \"acc_norm\": 0.8653654650468035,\n \"acc_norm_stderr\": 0.0034063520713417173\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337152,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337152\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n 
\"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424648,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424648\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6741935483870968,\n \"acc_stderr\": 0.026662010578567104,\n \"acc_norm\": 0.6741935483870968,\n \"acc_norm_stderr\": 0.026662010578567104\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386424,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386424\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6487179487179487,\n 
\"acc_stderr\": 0.024203665177902796,\n \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902796\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010323,\n \"acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010323\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565438,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565438\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909476,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909476\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n \"acc_stderr\": 0.013964393769899133,\n \"acc_norm\": 0.8122605363984674,\n \"acc_norm_stderr\": 
0.013964393769899133\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545546,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545546\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41675977653631285,\n \"acc_stderr\": 0.016489134962438954,\n \"acc_norm\": 0.41675977653631285,\n \"acc_norm_stderr\": 0.016489134962438954\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4589308996088657,\n \"acc_stderr\": 0.012727084826799798,\n \"acc_norm\": 0.4589308996088657,\n \"acc_norm_stderr\": 0.012727084826799798\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000325,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000325\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.02927956741106568,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.02927956741106568\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6119402985074627,\n \"acc_stderr\": 0.034457899643627506,\n \"acc_norm\": 0.6119402985074627,\n \"acc_norm_stderr\": 0.034457899643627506\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48959608323133413,\n \"mc1_stderr\": 0.017499711430249264,\n \"mc2\": 0.6572688672491508,\n \"mc2_stderr\": 0.014888678305017567\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.010941877955676211\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5860500379075056,\n \"acc_stderr\": 0.013566991960151778\n }\n}\n```", "repo_url": "https://huggingface.co/RaduGabriel/MUZD", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|arc:challenge|25_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|gsm8k|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hellaswag|10_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-37-51.263631.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-37-51.263631.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-37-51.263631.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T10-37-51.263631.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-37-51.263631.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-37-51.263631.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["**/details_harness|winogrande|5_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T10-37-51.263631.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T10_37_51.263631", "path": ["results_2024-02-14T10-37-51.263631.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T10-37-51.263631.parquet"]}]}]} | 2024-02-14T10:40:33+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of RaduGabriel/MUZD
Dataset automatically created during the evaluation run of model RaduGabriel/MUZD on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
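Following the pattern used by the other leaderboard detail cards, a snippet along these lines should work; note that the dataset id `open-llm-leaderboard/details_RaduGabriel__MUZD` is inferred from the leaderboard's `details_<org>__<model>` naming convention, while the `harness_winogrande_5` config name does appear in this dataset's config listing:

```python
from datasets import load_dataset

# Dataset id inferred from the leaderboard's details_<org>__<model> convention;
# the "harness_winogrande_5" config appears in this dataset's config listing.
data = load_dataset("open-llm-leaderboard/details_RaduGabriel__MUZD",
	"harness_winogrande_5",
	split="train")
```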
## Latest results
These are the latest results from run 2024-02-14T10:37:51.263631 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of RaduGabriel/MUZD\n\n\n\nDataset automatically created during the evaluation run of model RaduGabriel/MUZD on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T10:37:51.263631(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of RaduGabriel/MUZD\n\n\n\nDataset automatically created during the evaluation run of model RaduGabriel/MUZD on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T10:37:51.263631(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
177,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of RaduGabriel/MUZD\n\n\n\nDataset automatically created during the evaluation run of model RaduGabriel/MUZD on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T10:37:51.263631(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
616157785995d8fe7b8a76a797a2d9e5fcd53b7c |
# Dataset Card for Evaluation run of mlabonne/NeuralMonarch-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/NeuralMonarch-7B](https://huggingface.co/mlabonne/NeuralMonarch-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__NeuralMonarch-7B",
"harness_winogrande_5",
split="train")
```
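Assuming the same layout as the other detail datasets on the leaderboard, the aggregated metrics described above can be pulled from the separate `"results"` configuration; the `"latest"` split name follows the convention visible in the sibling cards' config listings:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics for the run; the "latest" split
# always points at the most recent evaluation (per the card's description).
results = load_dataset("open-llm-leaderboard/details_mlabonne__NeuralMonarch-7B",
	"results",
	split="latest")
```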
## Latest results
These are the [latest results from run 2024-02-14T10:44:03.358725](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralMonarch-7B/blob/main/results_2024-02-14T10-44-03.358725.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6501327325533349,
"acc_stderr": 0.032222664885814316,
"acc_norm": 0.6497540751488936,
"acc_norm_stderr": 0.03289485359002978,
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7779478264166126,
"mc2_stderr": 0.013764993545897771
},
"harness|arc:challenge|25": {
"acc": 0.7047781569965871,
"acc_stderr": 0.01332975029338232,
"acc_norm": 0.7320819112627986,
"acc_norm_stderr": 0.012942030195136444
},
"harness|hellaswag|10": {
"acc": 0.7168890659231228,
"acc_stderr": 0.0044958914405194205,
"acc_norm": 0.8908583947420833,
"acc_norm_stderr": 0.0031117953207879436
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.423841059602649,
"acc_stderr": 0.04034846678603397,
"acc_norm": 0.423841059602649,
"acc_norm_stderr": 0.04034846678603397
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530336,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530336
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464076,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464076
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41564245810055866,
"acc_stderr": 0.016482782187500666,
"acc_norm": 0.41564245810055866,
"acc_norm_stderr": 0.016482782187500666
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4784876140808344,
"acc_stderr": 0.012758410941038911,
"acc_norm": 0.4784876140808344,
"acc_norm_stderr": 0.012758410941038911
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.02850145286039655,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.02850145286039655
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7779478264166126,
"mc2_stderr": 0.013764993545897771
},
"harness|winogrande|5": {
"acc": 0.846093133385951,
"acc_stderr": 0.010141944523750038
},
"harness|gsm8k|5": {
"acc": 0.6777862016679302,
"acc_stderr": 0.012872435481188776
}
}
```
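As a minimal sketch, the JSON above can also be inspected programmatically once saved locally (the file name `results.json` is only illustrative); the key names mirror the harness task identifiers printed above:

```python
import json

# Assumes the JSON block above was saved as results.json (name is illustrative).
with open("results.json") as f:
    metrics = json.load(f)

# Per-task metrics are keyed by harness task id, e.g. the 25-shot ARC challenge:
print(metrics["harness|arc:challenge|25"]["acc_norm"])  # 0.7320819112627986
```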
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_mlabonne__NeuralMonarch-7B | [
"region:us"
] | 2024-02-14T10:46:24+00:00 | {"pretty_name": "Evaluation run of mlabonne/NeuralMonarch-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [mlabonne/NeuralMonarch-7B](https://huggingface.co/mlabonne/NeuralMonarch-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__NeuralMonarch-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T10:44:03.358725](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralMonarch-7B/blob/main/results_2024-02-14T10-44-03.358725.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6501327325533349,\n \"acc_stderr\": 0.032222664885814316,\n \"acc_norm\": 0.6497540751488936,\n \"acc_norm_stderr\": 0.03289485359002978,\n \"mc1\": 0.627906976744186,\n \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7779478264166126,\n \"mc2_stderr\": 0.013764993545897771\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7047781569965871,\n \"acc_stderr\": 0.01332975029338232,\n \"acc_norm\": 0.7320819112627986,\n \"acc_norm_stderr\": 0.012942030195136444\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7168890659231228,\n \"acc_stderr\": 0.0044958914405194205,\n \"acc_norm\": 0.8908583947420833,\n \"acc_norm_stderr\": 0.0031117953207879436\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.423841059602649,\n \"acc_stderr\": 0.04034846678603397,\n \"acc_norm\": 0.423841059602649,\n \"acc_norm_stderr\": 0.04034846678603397\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530336,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530336\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n \"acc_stderr\": 0.013778693778464076,\n \"acc_norm\": 
0.8186462324393359,\n \"acc_norm_stderr\": 0.013778693778464076\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41564245810055866,\n \"acc_stderr\": 0.016482782187500666,\n \"acc_norm\": 0.41564245810055866,\n \"acc_norm_stderr\": 0.016482782187500666\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4784876140808344,\n \"acc_stderr\": 0.012758410941038911,\n \"acc_norm\": 0.4784876140808344,\n \"acc_norm_stderr\": 0.012758410941038911\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039655,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039655\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.627906976744186,\n \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7779478264166126,\n \"mc2_stderr\": 0.013764993545897771\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.846093133385951,\n \"acc_stderr\": 0.010141944523750038\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6777862016679302,\n \"acc_stderr\": 0.012872435481188776\n }\n}\n```", "repo_url": "https://huggingface.co/mlabonne/NeuralMonarch-7B", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|arc:challenge|25_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|gsm8k|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hellaswag|10_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-44-03.358725.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-44-03.358725.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-44-03.358725.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T10-44-03.358725.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-44-03.358725.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-44-03.358725.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["**/details_harness|winogrande|5_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T10-44-03.358725.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T10_44_03.358725", "path": ["results_2024-02-14T10-44-03.358725.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T10-44-03.358725.parquet"]}]}]} | 2024-02-14T10:46:47+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of mlabonne/NeuralMonarch-7B
Dataset automatically created during the evaluation run of model mlabonne/NeuralMonarch-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
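```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__NeuralMonarch-7B",
	"harness_winogrande_5",
	split="train")
```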
## Latest results
These are the latest results from run 2024-02-14T10:44:03.358725 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
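```python
{
    "all": {
        "acc": 0.6501327325533349,
        "acc_stderr": 0.032222664885814316,
        "acc_norm": 0.6497540751488936,
        "acc_norm_stderr": 0.03289485359002978,
        "mc1": 0.627906976744186,
        "mc1_stderr": 0.01692109011881403,
        "mc2": 0.7779478264166126,
        "mc2_stderr": 0.013764993545897771
    }
    # ... the full per-task breakdown for this run is reproduced in the metadata field above
}
```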
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of mlabonne/NeuralMonarch-7B\n\n\n\nDataset automatically created during the evaluation run of model mlabonne/NeuralMonarch-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T10:44:03.358725(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of mlabonne/NeuralMonarch-7B\n\n\n\nDataset automatically created during the evaluation run of model mlabonne/NeuralMonarch-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T10:44:03.358725(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mlabonne/NeuralMonarch-7B\n\n\n\nDataset automatically created during the evaluation run of model mlabonne/NeuralMonarch-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T10:44:03.358725(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
8aa50f8c4665b9d1f531eb9e6788056a737bd757 |
## Dataset Information
| # Nodes | # Edges | # Features |
|:-------:|:-------:|:----------:|
| 19,717 | 88,648 | 8,710 |
Pre-processed as per the official codebase of https://arxiv.org/abs/2210.02016
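A minimal way to fetch the processed files locally is via `huggingface_hub` (a sketch; it assumes only that this is a standard Hugging Face dataset repository):

```python
from huggingface_hub import snapshot_download

# Download all files from the dataset repository into the local HF cache
# and return the path of the downloaded snapshot.
local_dir = snapshot_download(repo_id="SauravMaheshkar/pareto-pubmed", repo_type="dataset")
print(local_dir)
```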
## Citations
```
@inproceedings{ju2023multi,
title={Multi-task Self-supervised Graph Neural Networks Enable Stronger Task Generalization},
author={Ju, Mingxuan and Zhao, Tong and Wen, Qianlong and Yu, Wenhao and Shah, Neil and Ye, Yanfang and Zhang, Chuxu},
booktitle={International Conference on Learning Representations},
year={2023}
}
```
| SauravMaheshkar/pareto-pubmed | [
"task_categories:graph-ml",
"size_categories:1K<n<10K",
"license:cc",
"arxiv:2210.02016",
"region:us"
] | 2024-02-14T10:51:12+00:00 | {"license": "cc", "size_categories": ["1K<n<10K"], "task_categories": ["graph-ml"]} | 2024-02-14T11:05:28+00:00 | [
"2210.02016"
] | [] | TAGS
#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us
| Dataset Information
-------------------
Pre-processed as per the official codebase of URL
Citations
| [] | [
"TAGS\n#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us \n"
] | [
41
] | [
"passage: TAGS\n#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us \n"
] |
32a33defdfa1a9407c9521436e44ed44ed08a704 | # Dataset Card for "wsd_myriade_synth_data_v4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | gguichard/wsd_myriade_synth_data_v4 | [
"region:us"
] | 2024-02-14T10:57:16+00:00 | {"dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "wn_sens", "sequence": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 57036817, "num_examples": 101321}], "download_size": 0, "dataset_size": 57036817}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-14T13:36:25+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "wsd_myriade_synth_data_v4"
More Information needed | [
"# Dataset Card for \"wsd_myriade_synth_data_v4\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"wsd_myriade_synth_data_v4\"\n\nMore Information needed"
] | [
6,
24
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"wsd_myriade_synth_data_v4\"\n\nMore Information needed"
] |
fbd6e564c4244181de9c0e0bcd843ae1ffebf00e | # Sentiment Classification of Historical Danish and Norwegian Literary Texts
## Description
This project describes a study on sentiment classification in literary analysis of 19th-century Scandinavian novels by female authors. We create a dataset, train and evaluate sentiment classification methods, and use pre-trained language models to confirm and contest a literary hypothesis that the writing of female authors in that period was characterized by negative sentiment. The dataset and trained models are expected to be valuable for future analysis of historical Danish and Norwegian literary texts.
## Dataset
The dataset is uploaded to the `dataset` directory and is structured as follows:
1. `train_set.txt`: TXT file containing the training set with annotated text for sentiment analysis.
2. `dev_set.txt`: TXT file containing the development set with annotated text for sentiment analysis.
3. `test_set.txt`: TXT file containing the testing set with annotated text for sentiment analysis.
Each file contains two columns (tab-separated), where the first column is the sentence and the second column is the sentiment annotation (1=positive, 0=neutral, and 2=negative).
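A minimal reading sketch, assuming only the tab-separated layout described above (the `dataset/` path is illustrative):

```python
import csv

# Label mapping as described above: 1=positive, 0=neutral, 2=negative.
LABELS = {"1": "positive", "0": "neutral", "2": "negative"}

def read_split(path):
    """Yield (sentence, label) pairs from one tab-separated split file."""
    with open(path, encoding="utf-8") as f:
        for sentence, label in csv.reader(f, delimiter="\t"):
            yield sentence, LABELS[label]

# Example:
# pairs = list(read_split("dataset/train_set.txt"))
```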
## Usage
To use the dataset and code, follow these steps:
1. Clone or download this GitHub repository.
2. Access the dataset files in the `dataset` directory and the Python code file.
3. Use the dataset files for training, development, and testing of sentiment analysis models in your research or applications.
4. Run the Python code files using your preferred IDE or Python environment to understand how to load, preprocess, and analyze the historical text data.
## License
The dataset and code in this repository are released under the [Creative Commons Attribution 4.0 International license](http://creativecommons.org/licenses/by/4.0/).
## Citation
For more details about the sentiment annotation and classification, please read further in [the following paper](https://openreview.net/forum?id=dszKbb2GH3):
```
@inproceedings{allaith2023sentiment,
title={Sentiment Classification of Historical {D}anish and {N}orwegian Literary Texts},
author={Ali Al-Laith and Kirstine Nielsen Degn and Alexander Conroy and Bolette S. Pedersen and Jens Bjerring-Hansen and Daniel Hershcovich},
booktitle={The 24th Nordic Conference on Computational Linguistics},
year={2023},
url={https://openreview.net/forum?id=dszKbb2GH3}
}
``` | MiMe-MeMo/MeMo-Dataset-SA | [
"license:cc-by-nc-nd-4.0",
"region:us"
] | 2024-02-14T11:01:42+00:00 | {"license": "cc-by-nc-nd-4.0"} | 2024-02-14T11:08:07+00:00 | [] | [] | TAGS
#license-cc-by-nc-nd-4.0 #region-us
| # Sentiment Classification of Historical Danish and Norwegian Literary Texts
## Description
This project describes a study on sentiment classification in literary analysis of 19th-century Scandinavian novels by female authors. We create a dataset, train and evaluate sentiment classification methods, and use pre-trained language models to confirm and contest a literary hypothesis that the writing of female authors in that period was characterized by negative sentiment. The dataset and trained models are expected to be valuable for future analysis of historical Danish and Norwegian literary texts.
## Dataset
The dataset is uploaded to the 'dataset' directory and is structured as follows:
1. 'train_set.txt': TXT file containing the training set with annotated text for sentiment analysis.
2. 'dev_set.txt': TXT file containing the development set with annotated text for sentiment analysis.
3. 'test_set.txt': TXT file containing the testing set with annotated text for sentiment analysis.
Each file contains two columns (tab-separated), where the first column is the sentence and the second column is the sentiment annotation (1=positive, 0=neutral, and 2=negative).
## Usage
To use the dataset and code, follow these steps:
1. Clone or download this GitHub repository.
2. Access the dataset files in the 'dataset' directory and the Python code file.
3. Use the dataset files for training, development, and testing of sentiment analysis models in your research or applications.
4. Run the Python code files using your preferred IDE or Python environment to understand how to load, preprocess, and analyze the historical text data.
## License
The dataset and code in this repository are released under the Creative Commons Attribution 4.0 International license.
For more details about the sentiment annotation and classification, please read further in the following paper:
| [
"# Sentiment Classification of Historical Literary in Danish and Norwegian Texts",
"## Description\nThis project describes a study on sentiment classification in literary analysis of 19th-century Scandinavian novels by female authors. We create a dataset, train and evaluate sentiment classification methods, and use pre-trained language models to confirm and contest a literary hypothesis that the writing of female authors in that period was characterized by negative sentiment. The dataset and trained models are expected to be valuable for future analysis of historical Danish and Norwegian literary texts.",
"## Dataset\nThe dataset is uploaded to the 'dataset' directory and is structured as follows:\n\n1. 'train_set.txt': TXT file containing the training set with annotated text for sentiment analysis.\n2. 'dev_set.txt': TXT file containing the development set with annotated text for sentiment analysis.\n3. 'test_set.txt': TXT file containing the testing set with annotated text for sentiment analysis.\n\n\nEach file contains two columns (tab separated) where the first column is the sentence and the second column is the sentimen annoation (1=positive, 0=neutral, and 2=negative)",
"## Usage\nTo use the dataset and code, follow these steps:\n\n1. Clone or download this GitHub repository.\n2. Access the dataset files in the 'dataset' directory and the Python code file.\n3. Use the dataset files for training, development, and testing of sentiment analysis models in your research or applications.\n4. Run the Python code files using your preferred IDE or Python environment to understand how to load, preprocess, and analyze the historical text data.",
"## License\nThe dataset and code in this repository are released under the Creative Commons Attribution 4.0 International license.\n\nFor more details about the sentiment annoatation and classification, please read further in the following paper:"
] | [
"TAGS\n#license-cc-by-nc-nd-4.0 #region-us \n",
"# Sentiment Classification of Historical Literary in Danish and Norwegian Texts",
"## Description\nThis project describes a study on sentiment classification in literary analysis of 19th-century Scandinavian novels by female authors. We create a dataset, train and evaluate sentiment classification methods, and use pre-trained language models to confirm and contest a literary hypothesis that the writing of female authors in that period was characterized by negative sentiment. The dataset and trained models are expected to be valuable for future analysis of historical Danish and Norwegian literary texts.",
"## Dataset\nThe dataset is uploaded to the 'dataset' directory and is structured as follows:\n\n1. 'train_set.txt': TXT file containing the training set with annotated text for sentiment analysis.\n2. 'dev_set.txt': TXT file containing the development set with annotated text for sentiment analysis.\n3. 'test_set.txt': TXT file containing the testing set with annotated text for sentiment analysis.\n\n\nEach file contains two columns (tab separated) where the first column is the sentence and the second column is the sentimen annoation (1=positive, 0=neutral, and 2=negative)",
"## Usage\nTo use the dataset and code, follow these steps:\n\n1. Clone or download this GitHub repository.\n2. Access the dataset files in the 'dataset' directory and the Python code file.\n3. Use the dataset files for training, development, and testing of sentiment analysis models in your research or applications.\n4. Run the Python code files using your preferred IDE or Python environment to understand how to load, preprocess, and analyze the historical text data.",
"## License\nThe dataset and code in this repository are released under the Creative Commons Attribution 4.0 International license.\n\nFor more details about the sentiment annoatation and classification, please read further in the following paper:"
] | [
19,
16,
105,
153,
104,
44
] | [
"passage: TAGS\n#license-cc-by-nc-nd-4.0 #region-us \n# Sentiment Classification of Historical Literary in Danish and Norwegian Texts## Description\nThis project describes a study on sentiment classification in literary analysis of 19th-century Scandinavian novels by female authors. We create a dataset, train and evaluate sentiment classification methods, and use pre-trained language models to confirm and contest a literary hypothesis that the writing of female authors in that period was characterized by negative sentiment. The dataset and trained models are expected to be valuable for future analysis of historical Danish and Norwegian literary texts.## Dataset\nThe dataset is uploaded to the 'dataset' directory and is structured as follows:\n\n1. 'train_set.txt': TXT file containing the training set with annotated text for sentiment analysis.\n2. 'dev_set.txt': TXT file containing the development set with annotated text for sentiment analysis.\n3. 'test_set.txt': TXT file containing the testing set with annotated text for sentiment analysis.\n\n\nEach file contains two columns (tab separated) where the first column is the sentence and the second column is the sentimen annoation (1=positive, 0=neutral, and 2=negative)## Usage\nTo use the dataset and code, follow these steps:\n\n1. Clone or download this GitHub repository.\n2. Access the dataset files in the 'dataset' directory and the Python code file.\n3. Use the dataset files for training, development, and testing of sentiment analysis models in your research or applications.\n4. Run the Python code files using your preferred IDE or Python environment to understand how to load, preprocess, and analyze the historical text data.## License\nThe dataset and code in this repository are released under the Creative Commons Attribution 4.0 International license.\n\nFor more details about the sentiment annoatation and classification, please read further in the following paper:"
] |
a96466594a6340e7e1450a228c3a912f74df8a5d |
A small subset of https://huggingface.co/datasets/jondurbin/truthy-dpo-v0.1 with rating scores added to each row using distilabel's preference dataset cleaning example. | eren23/distilabel_first100_truthy-dpo-v0.1 | [
"task_categories:question-answering",
"task_categories:text-generation",
"language:en",
"dpo",
"preference-optimization",
"region:us"
] | 2024-02-14T11:15:41+00:00 | {"language": ["en"], "task_categories": ["question-answering", "text-generation"], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "system", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}, {"name": "generations", "sequence": "string"}, {"name": "order", "sequence": "string"}, {"name": "labelling_model", "dtype": "string"}, {"name": "labelling_prompt", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "raw_labelling_response", "dtype": "string"}, {"name": "rating", "sequence": "float64"}, {"name": "rationale", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 648583, "num_examples": 100}], "download_size": 330035, "dataset_size": 648583}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["dpo", "preference-optimization"]} | 2024-02-14T11:43:06+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #task_categories-text-generation #language-English #dpo #preference-optimization #region-us
|
A small subset of URL with rating scores added to each row using distilabel's preference dataset cleaning example. | [] | [
"TAGS\n#task_categories-question-answering #task_categories-text-generation #language-English #dpo #preference-optimization #region-us \n"
] | [
42
] | [
"passage: TAGS\n#task_categories-question-answering #task_categories-text-generation #language-English #dpo #preference-optimization #region-us \n"
] |
d724c595791ee3039e2dc46afab5e9edc099fbb1 |
# Dataset Card for DocLayNet large without image
## About this card (02/14/2024)
### Property and license
All information from this page but the content of this paragraph "About this card (02/14/2024)" has been copied/pasted from [Dataset Card for DocLayNet](https://huggingface.co/datasets/ds4sd/DocLayNet).
DocLayNet is a dataset created by Deep Search (IBM Research) published under [license CDLA-Permissive-1.0](https://huggingface.co/datasets/ds4sd/DocLayNet#licensing-information).
I do not claim any rights to the data taken from this dataset and published on this page.
# Dataset Card for DocLayNet
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Annotations](#annotations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://developer.ibm.com/exchanges/data/all/doclaynet/
- **Repository:** https://github.com/DS4SD/DocLayNet
- **Paper:** https://doi.org/10.1145/3534678.3539043
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
DocLayNet provides page-by-page layout segmentation ground-truth using bounding-boxes for 11 distinct class labels on 80863 unique pages from 6 document categories. It provides several unique features compared to related work such as PubLayNet or DocBank:
1. *Human Annotation*: DocLayNet is hand-annotated by well-trained experts, providing a gold-standard in layout segmentation through human recognition and interpretation of each page layout
2. *Large layout variability*: DocLayNet includes diverse and complex layouts from a large variety of public sources in Finance, Science, Patents, Tenders, Law texts and Manuals
3. *Detailed label set*: DocLayNet defines 11 class labels to distinguish layout features in high detail.
4. *Redundant annotations*: A fraction of the pages in DocLayNet are double- or triple-annotated, allowing to estimate annotation uncertainty and an upper-bound of achievable prediction accuracy with ML models
5. *Pre-defined train- test- and validation-sets*: DocLayNet provides fixed sets for each to ensure proportional representation of the class-labels and avoid leakage of unique layout styles across the sets.
### Supported Tasks and Leaderboards
We are hosting a competition in ICDAR 2023 based on the DocLayNet dataset. For more information see https://ds4sd.github.io/icdar23-doclaynet/.
## Dataset Structure
### Data Fields
DocLayNet provides three types of data assets:
1. Bounding-box annotations in COCO format for each PNG image
2. Extra: Single-page PDF files matching each PNG image
3. Extra: JSON file matching each PDF page, which provides the digital text cells with coordinates and content
The COCO image record is defined like this example:
```js
...
{
"id": 1,
"width": 1025,
"height": 1025,
"file_name": "132a855ee8b23533d8ae69af0049c038171a06ddfcac892c3c6d7e6b4091c642.png",
// Custom fields:
"doc_category": "financial_reports" // high-level document category
"collection": "ann_reports_00_04_fancy", // sub-collection name
"doc_name": "NASDAQ_FFIN_2002.pdf", // original document filename
"page_no": 9, // page number in original document
"precedence": 0, // Annotation order, non-zero in case of redundant double- or triple-annotation
},
...
```
The `doc_category` field uses one of the following constants:
```
financial_reports,
scientific_articles,
laws_and_regulations,
government_tenders,
manuals,
patents
```
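
The custom `doc_category` field makes it straightforward to slice the annotations by document category. A minimal sketch using only the standard library; the annotation file path below is hypothetical:

```python
import json
from collections import Counter

# Hypothetical path to one of the COCO annotation files from this dataset.
with open("COCO/train.json") as f:
    coco = json.load(f)

# Count pages per high-level document category via the custom image field.
pages_per_category = Counter(img["doc_category"] for img in coco["images"])
print(pages_per_category.most_common())
```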
### Data Splits
The dataset provides three splits
- `train`
- `val`
- `test`
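
With the `datasets` library the splits can be pulled straight from the Hub. A sketch assuming this repository id exposes the three splits under the names above:

```python
from datasets import load_dataset

# Assumes the repo id and split names above; adjust if the hosting differs.
ds = load_dataset("agomberto/DoCLayNet-large-wt-image")
print(ds)            # expected: DatasetDict with train / val / test
print(ds["val"][0])  # first validation record
```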
## Dataset Creation
### Annotations
#### Annotation process
The labeling guidelines used for training the annotation experts are available at [DocLayNet_Labeling_Guide_Public.pdf](https://raw.githubusercontent.com/DS4SD/DocLayNet/main/assets/DocLayNet_Labeling_Guide_Public.pdf).
#### Who are the annotators?
Annotations are crowdsourced.
## Additional Information
### Dataset Curators
The dataset is curated by the [Deep Search team](https://ds4sd.github.io/) at IBM Research.
You can contact us at [[email protected]](mailto:[email protected]).
Curators:
- Christoph Auer, [@cau-git](https://github.com/cau-git)
- Michele Dolfi, [@dolfim-ibm](https://github.com/dolfim-ibm)
- Ahmed Nassar, [@nassarofficial](https://github.com/nassarofficial)
- Peter Staar, [@PeterStaar-IBM](https://github.com/PeterStaar-IBM)
### Licensing Information
License: [CDLA-Permissive-1.0](https://cdla.io/permissive-1-0/)
### Citation Information
```bib
@article{doclaynet2022,
title = {DocLayNet: A Large Human-Annotated Dataset for Document-Layout Segmentation},
doi = {10.1145/3534678.353904},
url = {https://doi.org/10.1145/3534678.3539043},
author = {Pfitzmann, Birgit and Auer, Christoph and Dolfi, Michele and Nassar, Ahmed S and Staar, Peter W J},
year = {2022},
isbn = {9781450393850},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
booktitle = {Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining},
pages = {3743–3751},
numpages = {9},
location = {Washington DC, USA},
series = {KDD '22}
}
```
### Contributions
Thanks to [@dolfim-ibm](https://github.com/dolfim-ibm), [@cau-git](https://github.com/cau-git) for adding this dataset. | agomberto/DoCLayNet-large-wt-image | [
"task_categories:object-detection",
"task_categories:image-segmentation",
"task_categories:token-classification",
"task_ids:instance-segmentation",
"annotations_creators:crowdsourced",
"size_categories:10K<n<100K",
"language:en",
"language:de",
"language:fr",
"language:ja",
"license:other",
"DocLayNet",
"COCO",
"PDF",
"IBM",
"Financial-Reports",
"Finance",
"Manuals",
"Scientific-Articles",
"Science",
"Laws",
"Law",
"Regulations",
"Patents",
"Government-Tenders",
"object-detection",
"image-segmentation",
"token-classification",
"region:us"
] | 2024-02-14T11:15:41+00:00 | {"annotations_creators": ["crowdsourced"], "language": ["en", "de", "fr", "ja"], "license": "other", "size_categories": ["10K<n<100K"], "task_categories": ["object-detection", "image-segmentation", "token-classification"], "task_ids": ["instance-segmentation"], "pretty_name": "DocLayNet large", "tags": ["DocLayNet", "COCO", "PDF", "IBM", "Financial-Reports", "Finance", "Manuals", "Scientific-Articles", "Science", "Laws", "Law", "Regulations", "Patents", "Government-Tenders", "object-detection", "image-segmentation", "token-classification"]} | 2024-02-14T11:21:20+00:00 | [] | [
"en",
"de",
"fr",
"ja"
] | TAGS
#task_categories-object-detection #task_categories-image-segmentation #task_categories-token-classification #task_ids-instance-segmentation #annotations_creators-crowdsourced #size_categories-10K<n<100K #language-English #language-German #language-French #language-Japanese #license-other #DocLayNet #COCO #PDF #IBM #Financial-Reports #Finance #Manuals #Scientific-Articles #Science #Laws #Law #Regulations #Patents #Government-Tenders #object-detection #image-segmentation #token-classification #region-us
|
# Dataset Card for DocLayNet large without image
## About this card (02/14/2024)
### Property and license
All information from this page but the content of this paragraph "About this card (02/14/2024)" has been copied/pasted from Dataset Card for DocLayNet.
DocLayNet is a dataset created by Deep Search (IBM Research) published under license CDLA-Permissive-1.0.
I do not claim any rights to the data taken from this dataset and published on this page.
# Dataset Card for DocLayNet
## Table of Contents
- Table of Contents
- Dataset Description
- Dataset Summary
- Supported Tasks and Leaderboards
- Dataset Structure
- Data Fields
- Data Splits
- Dataset Creation
- Annotations
- Additional Information
- Dataset Curators
- Licensing Information
- Citation Information
- Contributions
## Dataset Description
- Homepage: URL
- Repository: URL
- Paper: URL
- Leaderboard:
- Point of Contact:
### Dataset Summary
DocLayNet provides page-by-page layout segmentation ground-truth using bounding-boxes for 11 distinct class labels on 80863 unique pages from 6 document categories. It provides several unique features compared to related work such as PubLayNet or DocBank:
1. *Human Annotation*: DocLayNet is hand-annotated by well-trained experts, providing a gold-standard in layout segmentation through human recognition and interpretation of each page layout
2. *Large layout variability*: DocLayNet includes diverse and complex layouts from a large variety of public sources in Finance, Science, Patents, Tenders, Law texts and Manuals
3. *Detailed label set*: DocLayNet defines 11 class labels to distinguish layout features in high detail.
4. *Redundant annotations*: A fraction of the pages in DocLayNet are double- or triple-annotated, allowing to estimate annotation uncertainty and an upper-bound of achievable prediction accuracy with ML models
5. *Pre-defined train- test- and validation-sets*: DocLayNet provides fixed sets for each to ensure proportional representation of the class-labels and avoid leakage of unique layout styles across the sets.
### Supported Tasks and Leaderboards
We are hosting a competition in ICDAR 2023 based on the DocLayNet dataset. For more information see URL
## Dataset Structure
### Data Fields
DocLayNet provides three types of data assets:
1. Bounding-box annotations in COCO format for each PNG image
2. Extra: Single-page PDF files matching each PNG image
3. Extra: JSON file matching each PDF page, which provides the digital text cells with coordinates and content
The COCO image record is defined like this example
The 'doc_category' field uses one of the following constants:
### Data Splits
The dataset provides three splits
- 'train'
- 'val'
- 'test'
## Dataset Creation
### Annotations
#### Annotation process
The labeling guidelines used for training the annotation experts are available at DocLayNet_Labeling_Guide_Public.pdf.
#### Who are the annotators?
Annotations are crowdsourced.
## Additional Information
### Dataset Curators
The dataset is curated by the Deep Search team at IBM Research.
You can contact us at deepsearch-core@URL.
Curators:
- Christoph Auer, @cau-git
- Michele Dolfi, @dolfim-ibm
- Ahmed Nassar, @nassarofficial
- Peter Staar, @PeterStaar-IBM
### Licensing Information
License: CDLA-Permissive-1.0
### Contributions
Thanks to @dolfim-ibm, @cau-git for adding this dataset. | [
"# Dataset Card for DocLayNet large without image",
"## About this card (02/14/2024)",
"### Property and license\n\nAll information from this page but the content of this paragraph \"About this card (02/14/2025)\" has been copied/pasted from Dataset Card for DocLayNet.\n\nDocLayNet is a dataset created by Deep Search (IBM Research) published under license CDLA-Permissive-1.0. \n\nI do not claim any rights to the data taken from this dataset and published on this page.",
"# Dataset Card for DocLayNet",
"## Table of Contents\n- Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n- Dataset Structure\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Annotations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions",
"## Dataset Description\n\n- Homepage: URL\n- Repository: URL\n- Paper: URL\n- Leaderboard:\n- Point of Contact:",
"### Dataset Summary\n\nDocLayNet provides page-by-page layout segmentation ground-truth using bounding-boxes for 11 distinct class labels on 80863 unique pages from 6 document categories. It provides several unique features compared to related work such as PubLayNet or DocBank:\n\n1. *Human Annotation*: DocLayNet is hand-annotated by well-trained experts, providing a gold-standard in layout segmentation through human recognition and interpretation of each page layout\n2. *Large layout variability*: DocLayNet includes diverse and complex layouts from a large variety of public sources in Finance, Science, Patents, Tenders, Law texts and Manuals\n3. *Detailed label set*: DocLayNet defines 11 class labels to distinguish layout features in high detail.\n4. *Redundant annotations*: A fraction of the pages in DocLayNet are double- or triple-annotated, allowing to estimate annotation uncertainty and an upper-bound of achievable prediction accuracy with ML models\n5. *Pre-defined train- test- and validation-sets*: DocLayNet provides fixed sets for each to ensure proportional representation of the class-labels and avoid leakage of unique layout styles across the sets.",
"### Supported Tasks and Leaderboards\n\nWe are hosting a competition in ICDAR 2023 based on the DocLayNet dataset. For more information see URL",
"## Dataset Structure",
"### Data Fields\n\nDocLayNet provides four types of data assets:\n\n1. Bounding-box annotations in COCO format for each PNG image\n2. Extra: Single-page PDF files matching each PNG image\n3. Extra: JSON file matching each PDF page, which provides the digital text cells with coordinates and content\n\nThe COCO image record are defined like this example\n\n\n\nThe 'doc_category' field uses one of the following constants:",
"### Data Splits\n\nThe dataset provides three splits\n- 'train'\n- 'val'\n- 'test'",
"## Dataset Creation",
"### Annotations",
"#### Annotation process\n\nThe labeling guideline used for training of the annotation experts are available at DocLayNet_Labeling_Guide_Public.pdf.",
"#### Who are the annotators?\n\nAnnotations are crowdsourced.",
"## Additional Information",
"### Dataset Curators\n\nThe dataset is curated by the Deep Search team at IBM Research.\nYou can contact us at deepsearch-core@URL.\n\nCurators:\n- Christoph Auer, @cau-git\n- Michele Dolfi, @dolfim-ibm\n- Ahmed Nassar, @nassarofficial\n- Peter Staar, @PeterStaar-IBM",
"### Licensing Information\n\nLicense: CDLA-Permissive-1.0",
"### Contributions\n\nThanks to @dolfim-ibm, @cau-git for adding this dataset."
] | [
"TAGS\n#task_categories-object-detection #task_categories-image-segmentation #task_categories-token-classification #task_ids-instance-segmentation #annotations_creators-crowdsourced #size_categories-10K<n<100K #language-English #language-German #language-French #language-Japanese #license-other #DocLayNet #COCO #PDF #IBM #Financial-Reports #Finance #Manuals #Scientific-Articles #Science #Laws #Law #Regulations #Patents #Government-Tenders #object-detection #image-segmentation #token-classification #region-us \n",
"# Dataset Card for DocLayNet large without image",
"## About this card (02/14/2024)",
"### Property and license\n\nAll information from this page but the content of this paragraph \"About this card (02/14/2025)\" has been copied/pasted from Dataset Card for DocLayNet.\n\nDocLayNet is a dataset created by Deep Search (IBM Research) published under license CDLA-Permissive-1.0. \n\nI do not claim any rights to the data taken from this dataset and published on this page.",
"# Dataset Card for DocLayNet",
"## Table of Contents\n- Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n- Dataset Structure\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Annotations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions",
"## Dataset Description\n\n- Homepage: URL\n- Repository: URL\n- Paper: URL\n- Leaderboard:\n- Point of Contact:",
"### Dataset Summary\n\nDocLayNet provides page-by-page layout segmentation ground-truth using bounding-boxes for 11 distinct class labels on 80863 unique pages from 6 document categories. It provides several unique features compared to related work such as PubLayNet or DocBank:\n\n1. *Human Annotation*: DocLayNet is hand-annotated by well-trained experts, providing a gold-standard in layout segmentation through human recognition and interpretation of each page layout\n2. *Large layout variability*: DocLayNet includes diverse and complex layouts from a large variety of public sources in Finance, Science, Patents, Tenders, Law texts and Manuals\n3. *Detailed label set*: DocLayNet defines 11 class labels to distinguish layout features in high detail.\n4. *Redundant annotations*: A fraction of the pages in DocLayNet are double- or triple-annotated, allowing to estimate annotation uncertainty and an upper-bound of achievable prediction accuracy with ML models\n5. *Pre-defined train- test- and validation-sets*: DocLayNet provides fixed sets for each to ensure proportional representation of the class-labels and avoid leakage of unique layout styles across the sets.",
"### Supported Tasks and Leaderboards\n\nWe are hosting a competition in ICDAR 2023 based on the DocLayNet dataset. For more information see URL",
"## Dataset Structure",
"### Data Fields\n\nDocLayNet provides four types of data assets:\n\n1. Bounding-box annotations in COCO format for each PNG image\n2. Extra: Single-page PDF files matching each PNG image\n3. Extra: JSON file matching each PDF page, which provides the digital text cells with coordinates and content\n\nThe COCO image record are defined like this example\n\n\n\nThe 'doc_category' field uses one of the following constants:",
"### Data Splits\n\nThe dataset provides three splits\n- 'train'\n- 'val'\n- 'test'",
"## Dataset Creation",
"### Annotations",
"#### Annotation process\n\nThe labeling guideline used for training of the annotation experts are available at DocLayNet_Labeling_Guide_Public.pdf.",
"#### Who are the annotators?\n\nAnnotations are crowdsourced.",
"## Additional Information",
"### Dataset Curators\n\nThe dataset is curated by the Deep Search team at IBM Research.\nYou can contact us at deepsearch-core@URL.\n\nCurators:\n- Christoph Auer, @cau-git\n- Michele Dolfi, @dolfim-ibm\n- Ahmed Nassar, @nassarofficial\n- Peter Staar, @PeterStaar-IBM",
"### Licensing Information\n\nLicense: CDLA-Permissive-1.0",
"### Contributions\n\nThanks to @dolfim-ibm, @cau-git for adding this dataset."
] | [
177,
12,
9,
93,
9,
74,
27,
296,
34,
6,
100,
25,
5,
5,
36,
17,
5,
80,
16,
26
] | [
"passage: TAGS\n#task_categories-object-detection #task_categories-image-segmentation #task_categories-token-classification #task_ids-instance-segmentation #annotations_creators-crowdsourced #size_categories-10K<n<100K #language-English #language-German #language-French #language-Japanese #license-other #DocLayNet #COCO #PDF #IBM #Financial-Reports #Finance #Manuals #Scientific-Articles #Science #Laws #Law #Regulations #Patents #Government-Tenders #object-detection #image-segmentation #token-classification #region-us \n# Dataset Card for DocLayNet large without image## About this card (02/14/2024)### Property and license\n\nAll information from this page but the content of this paragraph \"About this card (02/14/2025)\" has been copied/pasted from Dataset Card for DocLayNet.\n\nDocLayNet is a dataset created by Deep Search (IBM Research) published under license CDLA-Permissive-1.0. \n\nI do not claim any rights to the data taken from this dataset and published on this page.# Dataset Card for DocLayNet## Table of Contents\n- Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n- Dataset Structure\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Annotations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions## Dataset Description\n\n- Homepage: URL\n- Repository: URL\n- Paper: URL\n- Leaderboard:\n- Point of Contact:"
] |
d851e07c6d9e700dc0c4a94eada32c674906b5e7 |
# Dataset Card for Evaluation run of TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B](https://huggingface.co/TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TW3PartnersLLM__TW3-v2-AlpacaSmaug-72B",
"harness_winogrande_5",
split="train")
```
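
The aggregated metrics live in the additional "results" configuration mentioned above and can be loaded the same way; a sketch, assuming the configuration name is literally `results`:

```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_TW3PartnersLLM__TW3-v2-AlpacaSmaug-72B",
    "results",
    split="train",
)
print(results)
```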
## Latest results
These are the [latest results from run 2024-02-14T11:27:51.194235](https://huggingface.co/datasets/open-llm-leaderboard/details_TW3PartnersLLM__TW3-v2-AlpacaSmaug-72B/blob/main/results_2024-02-14T11-27-51.194235.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23082125681978313,
"acc_stderr": 0.02986949959492494,
"acc_norm": 0.23087014880014953,
"acc_norm_stderr": 0.030656588530011887,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931593,
"mc2": 0.48654521547048707,
"mc2_stderr": 0.01630952029889674
},
"harness|arc:challenge|25": {
"acc": 0.2150170648464164,
"acc_stderr": 0.012005717634133611,
"acc_norm": 0.257679180887372,
"acc_norm_stderr": 0.012780770562768407
},
"harness|hellaswag|10": {
"acc": 0.25652260505875324,
"acc_stderr": 0.004358210689442257,
"acc_norm": 0.2523401712806214,
"acc_norm_stderr": 0.00433467695270386
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066656,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066656
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24528301886792453,
"acc_stderr": 0.026480357179895678,
"acc_norm": 0.24528301886792453,
"acc_norm_stderr": 0.026480357179895678
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.19653179190751446,
"acc_stderr": 0.030299574664788137,
"acc_norm": 0.19653179190751446,
"acc_norm_stderr": 0.030299574664788137
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.0404933929774814,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.0404933929774814
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21164021164021163,
"acc_stderr": 0.021037331505262883,
"acc_norm": 0.21164021164021163,
"acc_norm_stderr": 0.021037331505262883
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020514,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020514
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.021444547301560493,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.021444547301560493
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34080717488789236,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.34080717488789236,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.02961432369045665,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.02961432369045665
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23243933588761176,
"acc_stderr": 0.015104550008905706,
"acc_norm": 0.23243933588761176,
"acc_norm_stderr": 0.015104550008905706
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.18006430868167203,
"acc_stderr": 0.021823422857744953,
"acc_norm": 0.18006430868167203,
"acc_norm_stderr": 0.021823422857744953
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931593,
"mc2": 0.48654521547048707,
"mc2_stderr": 0.01630952029889674
},
"harness|winogrande|5": {
"acc": 0.4972375690607735,
"acc_stderr": 0.014052271211616445
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
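
These numbers can also be processed directly from the JSON; a minimal sketch, assuming the blob above has been saved locally as `results.json` (a hypothetical filename):

```python
import json

with open("results.json") as f:  # hypothetical local copy of the JSON above
    results = json.load(f)

# Rank the MMLU (hendrycksTest) subtasks by normalized accuracy.
mmlu = {
    task: scores["acc_norm"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest")
}
for task, acc_norm in sorted(mmlu.items(), key=lambda kv: -kv[1])[:5]:
    print(f"{task}: {acc_norm:.3f}")
```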
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_TW3PartnersLLM__TW3-v2-AlpacaSmaug-72B | [
"region:us"
] | 2024-02-14T11:29:57+00:00 | {"pretty_name": "Evaluation run of TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B", "dataset_summary": "Dataset automatically created during the evaluation run of model [TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B](https://huggingface.co/TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TW3PartnersLLM__TW3-v2-AlpacaSmaug-72B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T11:27:51.194235](https://huggingface.co/datasets/open-llm-leaderboard/details_TW3PartnersLLM__TW3-v2-AlpacaSmaug-72B/blob/main/results_2024-02-14T11-27-51.194235.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23082125681978313,\n \"acc_stderr\": 0.02986949959492494,\n \"acc_norm\": 0.23087014880014953,\n \"acc_norm_stderr\": 0.030656588530011887,\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.014816195991931593,\n \"mc2\": 0.48654521547048707,\n \"mc2_stderr\": 0.01630952029889674\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2150170648464164,\n \"acc_stderr\": 0.012005717634133611,\n \"acc_norm\": 0.257679180887372,\n \"acc_norm_stderr\": 0.012780770562768407\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25652260505875324,\n \"acc_stderr\": 0.004358210689442257,\n \"acc_norm\": 0.2523401712806214,\n \"acc_norm_stderr\": 0.00433467695270386\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.03785714465066656,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.03785714465066656\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.24528301886792453,\n \"acc_stderr\": 0.026480357179895678,\n \"acc_norm\": 0.24528301886792453,\n \"acc_norm_stderr\": 0.026480357179895678\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.19653179190751446,\n \"acc_stderr\": 0.030299574664788137,\n \"acc_norm\": 0.19653179190751446,\n \"acc_norm_stderr\": 0.030299574664788137\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.0404933929774814,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.0404933929774814\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.21164021164021163,\n \"acc_stderr\": 0.021037331505262883,\n \"acc_norm\": 0.21164021164021163,\n \"acc_norm_stderr\": 0.021037331505262883\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n \"acc_stderr\": 0.035122074123020514,\n \"acc_norm\": 0.19047619047619047,\n \"acc_norm_stderr\": 0.035122074123020514\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 
0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.021444547301560493,\n \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.021444547301560493\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.34080717488789236,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728743,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728743\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.2863247863247863,\n \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23243933588761176,\n \"acc_stderr\": 0.015104550008905706,\n \"acc_norm\": 0.23243933588761176,\n \"acc_norm_stderr\": 0.015104550008905706\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.18006430868167203,\n \"acc_stderr\": 0.021823422857744953,\n \"acc_norm\": 0.18006430868167203,\n \"acc_norm_stderr\": 0.021823422857744953\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072775,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072775\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.031885780176863984,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.031885780176863984\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.014816195991931593,\n \"mc2\": 0.48654521547048707,\n \"mc2_stderr\": 0.01630952029889674\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4972375690607735,\n \"acc_stderr\": 0.014052271211616445\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|arc:challenge|25_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|gsm8k|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hellaswag|10_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-27-51.194235.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-27-51.194235.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-27-51.194235.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T11-27-51.194235.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-27-51.194235.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T11_27_51.194235", "path": ["**/details_harness|winogrande|5_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T11-27-51.194235.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_14T11_27_51.194235", "path": ["results_2024-02-14T11-27-51.194235.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T11-27-51.194235.parquet"]}]}]} | 2024-02-14T11:30:21+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B
Dataset automatically created during the evaluation run of model TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-14T11:27:51.194235 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B\n\n\n\nDataset automatically created during the evaluation run of model TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T11:27:51.194235(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B\n\n\n\nDataset automatically created during the evaluation run of model TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T11:27:51.194235(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
203,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B\n\n\n\nDataset automatically created during the evaluation run of model TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T11:27:51.194235(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
d974696c8beb59fe5d754bdcd734f3ed182dd745 |
Label is used to give context to the related text using the following map:
- 0 --> "PATIENT"
- 1 --> "DOCTOR"
- 2 --> "NEUTRAL" | LukeGPT88/text-classification-dataset | [
"region:us"
] | 2024-02-14T11:40:15+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 2167613, "num_examples": 24746}, {"name": "validation", "num_bytes": 712512, "num_examples": 8249}, {"name": "test", "num_bytes": 716933, "num_examples": 8249}], "download_size": 2372348, "dataset_size": 3597058}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-16T10:03:55+00:00 | [] | [] | TAGS
#region-us
|
Label is used to give context to the related text using the following map:
- 0 --> "PATIENT"
- 1 --> "DOCTOR"
- 2 --> "NEUTRAL" | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
c299bf89daac80c858a0a2396cd077be9fd88e53 |
# Dataset Card for ShareLM💬
<!-- Provide a quick summary of the dataset. -->
This dataset contains user interactions from various LLMs and platforms, organized into a unified format.
## What is ShareLM?
ShareLM is a Chrome plugin that makes it easy for you to contribute your own human-model interactions.
The Goal -> Collecting an ever-growing dataset of conversations, for the benefit of the open-source community 💬🥳
The conversations are released here with the most permissive restriction allowed by the specific model.
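
For a quick start, the unified conversations can be loaded with the `datasets` library (a minimal sketch; the default configuration exposes a single `train` split):

```python
from datasets import load_dataset

# Load the unified conversation records (default config, "train" split)
dataset = load_dataset("shachardon/ShareLM", split="train")
print(dataset[0])
```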
## Existing Datasets
In addition to the ShareLM data, we include here data from other great human-model interaction datasets:
- **Collective Cognition** https://huggingface.co/datasets/CollectiveCognition/chats-data-2023-10-16?row=11
- **hh rlhf** https://huggingface.co/datasets/Anthropic/hh-rlhf
- **babi** https://github.com/facebookarchive/bAbI-tasks
- **self-feeding** https://parl.ai/projects/self_feeding/
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | shachardon/ShareLM | [
"task_categories:conversational",
"size_categories:1M<n<10M",
"language:en",
"license:mit",
"region:us"
] | 2024-02-14T11:41:28+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["1M<n<10M"], "task_categories": ["conversational"], "pretty_name": "ShareLM", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": ["collective_cognition_formatted.csv", "hh_rlhf_formatted.csv", "babi_formatted.csv", "self_feeding_formatted.csv"]}]}]} | 2024-02-14T12:55:47+00:00 | [] | [
"en"
] | TAGS
#task_categories-conversational #size_categories-1M<n<10M #language-English #license-mit #region-us
|
# Dataset Card for ShareLM
This dataset contains user interactions from various LLMs and platforms, organized into a unified format.
## What is ShareLM?
ShareLM is a Chrome plugin that makes it easy for you to contribute your own human-model interactions.
The Goal -> Collecting an ever-growing dataset of conversations, for the benefit of the open-source community
The conversations are released here with the most permissive restriction allowed by the specific model.
## Existing Datasets
In addition to the ShareLM data, we include here data from other great human-model interaction datasets:
- Collective Cognition URL
- hh rlhf URL
- babi URL
- self-feeding URL
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for ShareLM\n\n\n\nThis dataset contains users interactions from various LLMs and platforms, organized into a unified format.",
"## What is ShareLM?\nShareLM is a chrome plugin that makes it easy for you to contribute your own human-model interactions.\n\nThe Goal -> Collecting an ever-growing dataset of conversations, for the benefit of the open-source community \n\nThe conversations are released here with the most permissive restriction allowed by the specific model.",
"## Existing Datasets\nIn addition to the ShareLM data, we include here data from other great human-model interaction datasets:\n\n- Collective Cognition URL\n- hh rlhf URL\n- babi URL\n- self-feeding URL",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#task_categories-conversational #size_categories-1M<n<10M #language-English #license-mit #region-us \n",
"# Dataset Card for ShareLM\n\n\n\nThis dataset contains users interactions from various LLMs and platforms, organized into a unified format.",
"## What is ShareLM?\nShareLM is a chrome plugin that makes it easy for you to contribute your own human-model interactions.\n\nThe Goal -> Collecting an ever-growing dataset of conversations, for the benefit of the open-source community \n\nThe conversations are released here with the most permissive restriction allowed by the specific model.",
"## Existing Datasets\nIn addition to the ShareLM data, we include here data from other great human-model interaction datasets:\n\n- Collective Cognition URL\n- hh rlhf URL\n- babi URL\n- self-feeding URL",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
37,
32,
73,
53,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#task_categories-conversational #size_categories-1M<n<10M #language-English #license-mit #region-us \n# Dataset Card for ShareLM\n\n\n\nThis dataset contains users interactions from various LLMs and platforms, organized into a unified format.## What is ShareLM?\nShareLM is a chrome plugin that makes it easy for you to contribute your own human-model interactions.\n\nThe Goal -> Collecting an ever-growing dataset of conversations, for the benefit of the open-source community \n\nThe conversations are released here with the most permissive restriction allowed by the specific model.## Existing Datasets\nIn addition to the ShareLM data, we include here data from other great human-model interaction datasets:\n\n- Collective Cognition URL\n- hh rlhf URL\n- babi URL\n- self-feeding URL## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
b2747b53a47e85393b2e80588fb20d226f114b68 |
## Ukrainian Toxicity Dataset (translated)
In addition to the Twitter-filtered [data](https://huggingface.co/datasets/ukr-detect/ukr-toxicity-dataset), we provide the English Jigsaw Toxicity Classification Dataset translated into Ukrainian.
## Dataset formation:
1. English data source: https://www.kaggle.com/competitions/jigsaw-toxic-comment-classification-challenge/
2. Working with data to get only two labels: a toxic and a non-toxic sentence.
3. Translation into Ukrainian language using model: https://huggingface.co/Helsinki-NLP/opus-mt-en-uk
Labels: 0 - non-toxic, 1 - toxic.
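
As a sketch of step 3 above, the translation step can be reproduced with the `transformers` translation pipeline (a minimal example; batching, truncation and any post-filtering are omitted):

```python
from transformers import pipeline

# Minimal sketch of formation step 3: English -> Ukrainian translation
# with the model named above (Helsinki-NLP/opus-mt-en-uk).
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-uk")
result = translator("This comment is toxic.")
print(result[0]["translation_text"])
```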
## Load dataset:
```
from datasets import load_dataset
dataset = load_dataset("ukr-detect/ukr-toxicity-dataset-translated-jigsaw")
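
# Quick sanity check of the label balance (column names follow the
# dataset schema: "text" and "labels"; 0 = non-toxic, 1 = toxic):
from collections import Counter
print(Counter(dataset["train"]["labels"]))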
``` | ukr-detect/ukr-toxicity-dataset-translated-jigsaw | [
"license:openrail++",
"region:us"
] | 2024-02-14T11:46:55+00:00 | {"license": "openrail++", "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "labels", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 40682670, "num_examples": 128549}, {"name": "test", "num_bytes": 15661720, "num_examples": 52294}], "download_size": 29856802, "dataset_size": 56344390}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-15T05:52:51+00:00 | [] | [] | TAGS
#license-openrail++ #region-us
|
## Ukrainian Toxicity Dataset (translated)
In addition to the Twitter-filtered data, we provide the English Jigsaw Toxicity Classification Dataset translated into Ukrainian.
## Dataset formation:
1. English data source: URL
2. Working with data to get only two labels: a toxic and a non-toxic sentence.
3. Translation into Ukrainian language using model: URL
Labels: 0 - non-toxic, 1 - toxic.
## Load dataset:
| [
"## Ukrainian Toxicity Dataset (translated)\nAdditionaly to the twitter filtered data, we provide translated English Jigsaw Toxicity Classification Dataset to Ukrainian.",
"## Dataset formation:\n1. English data source: URL\n2. Working with data to get only two labels: a toxic and a non-toxic sentence.\n3. Translation into Ukrainian language using model: URL\n\nLabels: 0 - non-toxic, 1 - toxic.",
"## Load dataset:"
] | [
"TAGS\n#license-openrail++ #region-us \n",
"## Ukrainian Toxicity Dataset (translated)\nAdditionaly to the twitter filtered data, we provide translated English Jigsaw Toxicity Classification Dataset to Ukrainian.",
"## Dataset formation:\n1. English data source: URL\n2. Working with data to get only two labels: a toxic and a non-toxic sentence.\n3. Translation into Ukrainian language using model: URL\n\nLabels: 0 - non-toxic, 1 - toxic.",
"## Load dataset:"
] | [
13,
44,
56,
6
] | [
"passage: TAGS\n#license-openrail++ #region-us \n## Ukrainian Toxicity Dataset (translated)\nAdditionaly to the twitter filtered data, we provide translated English Jigsaw Toxicity Classification Dataset to Ukrainian.## Dataset formation:\n1. English data source: URL\n2. Working with data to get only two labels: a toxic and a non-toxic sentence.\n3. Translation into Ukrainian language using model: URL\n\nLabels: 0 - non-toxic, 1 - toxic.## Load dataset:"
] |
e7e6d88a64101f389871dfc94d3a76614fbf2a3e |
# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v4.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bardsai/jaskier-7b-dpo-v4.1](https://huggingface.co/bardsai/jaskier-7b-dpo-v4.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.1",
"harness_winogrande_5",
split="train")
```
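If you are unsure which of the 63 configurations to pass, the `datasets` library can enumerate them; for instance (a hedged sketch, untested against this exact repo):

```python
from datasets import get_dataset_config_names

configs = get_dataset_config_names("open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.1")
print(len(configs), "configurations available")  # the card above reports 63
```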
## Latest results
These are the [latest results from run 2024-02-14T11:50:03.919128](https://huggingface.co/datasets/open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.1/blob/main/results_2024-02-14T11-50-03.919128.json)(note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6535379230152994,
"acc_stderr": 0.03198200076203346,
"acc_norm": 0.6529860127972825,
"acc_norm_stderr": 0.032649840332740133,
"mc1": 0.6132190942472461,
"mc1_stderr": 0.017048857010515103,
"mc2": 0.7592071044175611,
"mc2_stderr": 0.01411814026868143
},
"harness|arc:challenge|25": {
"acc": 0.712457337883959,
"acc_stderr": 0.013226719056266129,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.7173869747062338,
"acc_stderr": 0.0044934958720001085,
"acc_norm": 0.8906592312288388,
"acc_norm_stderr": 0.0031142850772280296
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473082,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886797,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886797
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.0154808268653743,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.0154808268653743
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903347,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903347
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45139664804469276,
"acc_stderr": 0.016643307372315872,
"acc_norm": 0.45139664804469276,
"acc_norm_stderr": 0.016643307372315872
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.01275285834653313,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.01275285834653313
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6132190942472461,
"mc1_stderr": 0.017048857010515103,
"mc2": 0.7592071044175611,
"mc2_stderr": 0.01411814026868143
},
"harness|winogrande|5": {
"acc": 0.8468823993685872,
"acc_stderr": 0.010120623252272955
},
"harness|gsm8k|5": {
"acc": 0.6830932524639879,
"acc_stderr": 0.012815868296721362
}
}
```
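The same aggregated numbers can also be pulled programmatically via the "results" configuration described above; a minimal sketch (assuming the "latest" split naming used throughout this card):

```python
from datasets import load_dataset

# aggregated metrics for the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.1",
    "results",
    split="latest",
)
print(results)
```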
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.1 | [
"region:us"
] | 2024-02-14T11:52:23+00:00 | {"pretty_name": "Evaluation run of bardsai/jaskier-7b-dpo-v4.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [bardsai/jaskier-7b-dpo-v4.1](https://huggingface.co/bardsai/jaskier-7b-dpo-v4.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T11:50:03.919128](https://huggingface.co/datasets/open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.1/blob/main/results_2024-02-14T11-50-03.919128.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6535379230152994,\n \"acc_stderr\": 0.03198200076203346,\n \"acc_norm\": 0.6529860127972825,\n \"acc_norm_stderr\": 0.032649840332740133,\n \"mc1\": 0.6132190942472461,\n \"mc1_stderr\": 0.017048857010515103,\n \"mc2\": 0.7592071044175611,\n \"mc2_stderr\": 0.01411814026868143\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.712457337883959,\n \"acc_stderr\": 0.013226719056266129,\n \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7173869747062338,\n \"acc_stderr\": 0.0044934958720001085,\n \"acc_norm\": 0.8906592312288388,\n \"acc_norm_stderr\": 0.0031142850772280296\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886797,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886797\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.0154808268653743,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.0154808268653743\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8250319284802043,\n \"acc_stderr\": 0.013586619219903347,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 0.013586619219903347\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45139664804469276,\n \"acc_stderr\": 0.016643307372315872,\n \"acc_norm\": 0.45139664804469276,\n \"acc_norm_stderr\": 0.016643307372315872\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n \"acc_stderr\": 0.01275285834653313,\n \"acc_norm\": 0.47392438070404175,\n \"acc_norm_stderr\": 0.01275285834653313\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6132190942472461,\n \"mc1_stderr\": 0.017048857010515103,\n \"mc2\": 0.7592071044175611,\n \"mc2_stderr\": 0.01411814026868143\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272955\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6830932524639879,\n \"acc_stderr\": 0.012815868296721362\n 
}\n}\n```", "repo_url": "https://huggingface.co/bardsai/jaskier-7b-dpo-v4.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|arc:challenge|25_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|gsm8k|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hellaswag|10_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-50-03.919128.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-50-03.919128.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-50-03.919128.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T11-50-03.919128.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-50-03.919128.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T11_50_03.919128", "path": ["**/details_harness|winogrande|5_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T11-50-03.919128.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_14T11_50_03.919128", "path": ["results_2024-02-14T11-50-03.919128.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T11-50-03.919128.parquet"]}]}]} | 2024-02-14T11:52:45+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v4.1
Dataset automatically created during the evaluation run of model bardsai/jaskier-7b-dpo-v4.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
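(The original snippet was dropped from this record; below is a minimal reconstruction following the loading pattern used by the other evaluation cards in this dump. The repository name is an assumption, inferred from the `details_<org>__<model>` naming convention; the `harness_winogrande_5` config appears in this record's file listing.)

```python
from datasets import load_dataset

# repository name inferred from the details_<org>__<model> convention
data = load_dataset("open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.1",
	"harness_winogrande_5",
	split="train")
```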
## Latest results
These are the latest results from run 2024-02-14T11:50:03.919128 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v4.1\n\n\n\nDataset automatically created during the evaluation run of model bardsai/jaskier-7b-dpo-v4.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T11:50:03.919128(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v4.1\n\n\n\nDataset automatically created during the evaluation run of model bardsai/jaskier-7b-dpo-v4.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T11:50:03.919128(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v4.1\n\n\n\nDataset automatically created during the evaluation run of model bardsai/jaskier-7b-dpo-v4.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T11:50:03.919128(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
671d1e6bbf45a74ef21af351fd4ef7b32b7856f8 | ## Ukrainian Formality Dataset (translated)
We obtained the first-of-its-kind Ukrainian Formality Classification dataset by translating English GYAFC data.
## Dataset formation:
1. English data source: https://aclanthology.org/N18-1012/
2. Translation into Ukrainian language using model: https://huggingface.co/facebook/nllb-200-distilled-600M (a minimal sketch of this step follows below)
3. Additionally, the dataset was balanced.
Labels: 0 - informal, 1 - formal.
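Step 2 above can be reproduced roughly as follows; this is a minimal sketch assuming the standard `transformers` translation pipeline and FLORES-200 language codes for NLLB (the codes are our assumption, not stated in the card):

```python
from transformers import pipeline

# hypothetical reconstruction of the card's translation step;
# "eng_Latn" -> "ukr_Cyrl" are the FLORES-200 codes NLLB expects
translator = pipeline(
    "translation",
    model="facebook/nllb-200-distilled-600M",
    src_lang="eng_Latn",
    tgt_lang="ukr_Cyrl",
)
print(translator("Hey, what's up?")[0]["translation_text"])
```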
## Load dataset:
```python
from datasets import load_dataset
dataset = load_dataset("ukr-detect/ukr-formality-dataset-translated-gyafc")
``` | ukr-detect/ukr-formality-dataset-translated-gyafc | [
"task_categories:text-classification",
"language:uk",
"license:openrail++",
"region:us"
] | 2024-02-14T11:52:25+00:00 | {"language": ["uk"], "license": "openrail++", "task_categories": ["text-classification"], "pretty_name": "ukr-fomalit", "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "labels", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 21864433, "num_examples": 209124}, {"name": "validation", "num_bytes": 1066875, "num_examples": 10272}, {"name": "test", "num_bytes": 512199, "num_examples": 4853}], "download_size": 11963779, "dataset_size": 23443507}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-15T05:57:50+00:00 | [] | [
"uk"
] | TAGS
#task_categories-text-classification #language-Ukrainian #license-openrail++ #region-us
| ## Ukrainian Formality Dataset (translated)
We obtained the first-of-its-kind Ukrainian Formality Classification dataset by translating English GYAFC data.
## Dataset formation:
1. English data source: URL
2. Translation into Ukrainian language using model: URL
3. Additionally, the dataset was balanced.
Labels: 0 - informal, 1 - formal.
## Load dataset:
| [
"## Ukrainian Formality Dataset (translated)\nWe obtained the first of its kind Ukrainian Formality Classification dataset by trainslating English GYAFC data.",
"## Dataset formation:\n1. English data source: URL\n2. Translation into Ukrainian language using model: URL\n3. Additionally, the dataset was balanced.\n\nLabels: 0 - informal, 1 - formal.",
"## Load dataset:"
] | [
"TAGS\n#task_categories-text-classification #language-Ukrainian #license-openrail++ #region-us \n",
"## Ukrainian Formality Dataset (translated)\nWe obtained the first of its kind Ukrainian Formality Classification dataset by trainslating English GYAFC data.",
"## Dataset formation:\n1. English data source: URL\n2. Translation into Ukrainian language using model: URL\n3. Additionally, the dataset was balanced.\n\nLabels: 0 - informal, 1 - formal.",
"## Load dataset:"
] | [
31,
39,
44,
6
] | [
"passage: TAGS\n#task_categories-text-classification #language-Ukrainian #license-openrail++ #region-us \n## Ukrainian Formality Dataset (translated)\nWe obtained the first of its kind Ukrainian Formality Classification dataset by trainslating English GYAFC data.## Dataset formation:\n1. English data source: URL\n2. Translation into Ukrainian language using model: URL\n3. Additionally, the dataset was balanced.\n\nLabels: 0 - informal, 1 - formal.## Load dataset:"
] |
df3fbbddc858f50c94ae6060df70e9e6bf0d44a4 | ## Ukrainian NLI (translated)
We obtained the first-of-its-kind Ukrainian Natural Language Inference Dataset by translating English NLI data.
## Dataset formation:
1. English data source: https://nlp.stanford.edu/projects/snli/
2. Translation into Ukrainian language using model: https://huggingface.co/facebook/nllb-200-distilled-600M
Labels: 0 - entailment, 1 - neutral, 2 - contradiction.
## Load dataset:
```python
from datasets import load_dataset
dataset = load_dataset("ukr-detect/ukr-nli-dataset-translated-stanford")
``` | ukr-detect/ukr-nli-dataset-translated-stanford | [
"task_categories:text-classification",
"language:uk",
"license:openrail++",
"region:us"
] | 2024-02-14T11:55:31+00:00 | {"language": ["uk"], "license": "openrail++", "task_categories": ["text-classification"], "pretty_name": "ukr-nli", "dataset_info": {"features": [{"name": "premise", "dtype": "string"}, {"name": "hypothesis", "dtype": "string"}, {"name": "labels", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 107441017, "num_examples": 549361}, {"name": "validation", "num_bytes": 2029907, "num_examples": 9842}, {"name": "test", "num_bytes": 2025559, "num_examples": 9824}], "download_size": 27765800, "dataset_size": 111496483}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-15T05:50:12+00:00 | [] | [
"uk"
] | TAGS
#task_categories-text-classification #language-Ukrainian #license-openrail++ #region-us
| ## Ukrainian NLI (translated)
We obtained the first-of-its-kind Ukrainian Natural Language Inference Dataset by translating English NLI data.
## Dataset formation:
1. English data source: URL
2. Translation into Ukrainian language using model: URL
Labels: 0 - entailment, 1 - neutral, 2 - contradiction.
## Load dataset:
| [
"## Ukrainian NLI (translated)\nWe obtained the first of its kind Ukrainian Natural Language Inference Dataset by trainslating English NLI data.",
"## Dataset formation:\n1. English data source: URL\n2. Translation into Ukrainian language using model: URL\n\nLabels: 0 - entailment, 1 - neutral, 2 - contradiction.",
"## Load dataset:"
] | [
"TAGS\n#task_categories-text-classification #language-Ukrainian #license-openrail++ #region-us \n",
"## Ukrainian NLI (translated)\nWe obtained the first of its kind Ukrainian Natural Language Inference Dataset by trainslating English NLI data.",
"## Dataset formation:\n1. English data source: URL\n2. Translation into Ukrainian language using model: URL\n\nLabels: 0 - entailment, 1 - neutral, 2 - contradiction.",
"## Load dataset:"
] | [
31,
37,
39,
6
] | [
"passage: TAGS\n#task_categories-text-classification #language-Ukrainian #license-openrail++ #region-us \n## Ukrainian NLI (translated)\nWe obtained the first of its kind Ukrainian Natural Language Inference Dataset by trainslating English NLI data.## Dataset formation:\n1. English data source: URL\n2. Translation into Ukrainian language using model: URL\n\nLabels: 0 - entailment, 1 - neutral, 2 - contradiction.## Load dataset:"
] |
e715997d2d228ab5264560b4ae5f6aac4d58b789 |
# Dataset Card for Evaluation run of arlineka/Brunhilde-13b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [arlineka/Brunhilde-13b](https://huggingface.co/arlineka/Brunhilde-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_arlineka__Brunhilde-13b",
"harness_winogrande_5",
split="train")
```
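The aggregated "results" configuration mentioned above can be loaded the same way; a minimal sketch (the config and "latest" split names are taken from this card's own file listing):

```python
from datasets import load_dataset

# "results" aggregates every task's metrics for the run;
# the "latest" split always points at the most recent evaluation
results = load_dataset("open-llm-leaderboard/details_arlineka__Brunhilde-13b",
	"results",
	split="latest")
```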
## Latest results
These are the [latest results from run 2024-02-14T11:54:25.541681](https://huggingface.co/datasets/open-llm-leaderboard/details_arlineka__Brunhilde-13b/blob/main/results_2024-02-14T11-54-25.541681.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5588896471810725,
"acc_stderr": 0.03358524149192356,
"acc_norm": 0.5671325395608912,
"acc_norm_stderr": 0.034363791698055104,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5235005454748325,
"mc2_stderr": 0.01582550300012819
},
"harness|arc:challenge|25": {
"acc": 0.5836177474402731,
"acc_stderr": 0.014405618279436178,
"acc_norm": 0.6049488054607508,
"acc_norm_stderr": 0.014285898292938169
},
"harness|hellaswag|10": {
"acc": 0.6406094403505278,
"acc_stderr": 0.004788412062375697,
"acc_norm": 0.8348934475204143,
"acc_norm_stderr": 0.0037051790292873302
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.030285009259009787,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.030285009259009787
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808777,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.032579014820998356,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.032579014820998356
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047736,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047736
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.02716253782694846,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.02716253782694846
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486518,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486518
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0368105086916155,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0368105086916155
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624526,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624526
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5102564102564102,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.5102564102564102,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7376146788990826,
"acc_stderr": 0.01886188502153473,
"acc_norm": 0.7376146788990826,
"acc_norm_stderr": 0.01886188502153473
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890488,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890488
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7522349936143039,
"acc_stderr": 0.015438083080568973,
"acc_norm": 0.7522349936143039,
"acc_norm_stderr": 0.015438083080568973
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016127,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016127
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47262569832402235,
"acc_stderr": 0.016697420650642752,
"acc_norm": 0.47262569832402235,
"acc_norm_stderr": 0.016697420650642752
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.02778014120702335,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.02778014120702335
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.02741799670563099,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.02741799670563099
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.0266756119260371,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.0266756119260371
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255855,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255855
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42242503259452413,
"acc_stderr": 0.012615600475734921,
"acc_norm": 0.42242503259452413,
"acc_norm_stderr": 0.012615600475734921
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5477941176470589,
"acc_stderr": 0.030233758551596445,
"acc_norm": 0.5477941176470589,
"acc_norm_stderr": 0.030233758551596445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5669934640522876,
"acc_stderr": 0.020045442473324224,
"acc_norm": 0.5669934640522876,
"acc_norm_stderr": 0.020045442473324224
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357303,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357303
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5235005454748325,
"mc2_stderr": 0.01582550300012819
},
"harness|winogrande|5": {
"acc": 0.755327545382794,
"acc_stderr": 0.012082125654159738
},
"harness|gsm8k|5": {
"acc": 0.09173616376042457,
"acc_stderr": 0.007950942148339338
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_arlineka__Brunhilde-13b | [
"region:us"
] | 2024-02-14T11:56:46+00:00 | {"pretty_name": "Evaluation run of arlineka/Brunhilde-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [arlineka/Brunhilde-13b](https://huggingface.co/arlineka/Brunhilde-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_arlineka__Brunhilde-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T11:54:25.541681](https://huggingface.co/datasets/open-llm-leaderboard/details_arlineka__Brunhilde-13b/blob/main/results_2024-02-14T11-54-25.541681.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5588896471810725,\n \"acc_stderr\": 0.03358524149192356,\n \"acc_norm\": 0.5671325395608912,\n \"acc_norm_stderr\": 0.034363791698055104,\n \"mc1\": 0.3537331701346389,\n \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5235005454748325,\n \"mc2_stderr\": 0.01582550300012819\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5836177474402731,\n \"acc_stderr\": 0.014405618279436178,\n \"acc_norm\": 0.6049488054607508,\n \"acc_norm_stderr\": 0.014285898292938169\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6406094403505278,\n \"acc_stderr\": 0.004788412062375697,\n \"acc_norm\": 0.8348934475204143,\n \"acc_norm_stderr\": 0.0037051790292873302\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009787,\n \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009787\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n 
\"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808777,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808777\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.032579014820998356,\n \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.032579014820998356\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047736,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047736\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n \"acc_stderr\": 0.02716253782694846,\n \"acc_norm\": 0.6483870967741936,\n \"acc_norm_stderr\": 0.02716253782694846\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486518,\n \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486518\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0368105086916155,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0368105086916155\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\": 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624526,\n \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624526\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5102564102564102,\n \"acc_stderr\": 0.025345672221942374,\n \"acc_norm\": 0.5102564102564102,\n \"acc_norm_stderr\": 0.025345672221942374\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7376146788990826,\n \"acc_stderr\": 0.01886188502153473,\n \"acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.01886188502153473\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n \"acc_stderr\": 0.026246772946890488,\n \"acc_norm\": 0.7991452991452992,\n \"acc_norm_stderr\": 0.026246772946890488\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7522349936143039,\n \"acc_stderr\": 0.015438083080568973,\n 
\"acc_norm\": 0.7522349936143039,\n \"acc_norm_stderr\": 0.015438083080568973\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016127,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016127\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47262569832402235,\n \"acc_stderr\": 0.016697420650642752,\n \"acc_norm\": 0.47262569832402235,\n \"acc_norm_stderr\": 0.016697420650642752\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.02778014120702335,\n \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.02778014120702335\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n \"acc_stderr\": 0.02741799670563099,\n \"acc_norm\": 0.6302250803858521,\n \"acc_norm_stderr\": 0.02741799670563099\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.0266756119260371,\n \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.0266756119260371\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255855,\n \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255855\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42242503259452413,\n \"acc_stderr\": 0.012615600475734921,\n \"acc_norm\": 0.42242503259452413,\n \"acc_norm_stderr\": 0.012615600475734921\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.030233758551596445,\n \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.030233758551596445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324224,\n \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324224\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n \"acc_stderr\": 0.03096590312357303,\n \"acc_norm\": 0.7412935323383084,\n \"acc_norm_stderr\": 0.03096590312357303\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3537331701346389,\n \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5235005454748325,\n \"mc2_stderr\": 0.01582550300012819\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09173616376042457,\n \"acc_stderr\": 0.007950942148339338\n }\n}\n```", "repo_url": 
"https://huggingface.co/arlineka/Brunhilde-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|arc:challenge|25_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|gsm8k|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hellaswag|10_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-54-25.541681.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-54-25.541681.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-54-25.541681.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T11-54-25.541681.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-54-25.541681.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-54-25.541681.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["**/details_harness|winogrande|5_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T11-54-25.541681.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T11_54_25.541681", "path": ["results_2024-02-14T11-54-25.541681.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T11-54-25.541681.parquet"]}]}]} | 2024-02-14T11:57:25+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of arlineka/Brunhilde-13b
Dataset automatically created during the evaluation run of model arlineka/Brunhilde-13b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
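A minimal sketch of that call follows; note the repository name is an assumption here, inferred from the leaderboard's usual naming convention ("open-llm-leaderboard/details_<org>__<model>"), so adjust it if the actual repo differs:

```python
from datasets import load_dataset

# Repository name assumed from the leaderboard naming pattern; the config name
# picks one of the 63 evaluated tasks and "train" points to the latest run.
data = load_dataset("open-llm-leaderboard/details_arlineka__Brunhilde-13b",
	"harness_winogrande_5",
	split="train")
```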
## Latest results
These are the latest results from run 2024-02-14T11:54:25.541681 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of arlineka/Brunhilde-13b\n\n\n\nDataset automatically created during the evaluation run of model arlineka/Brunhilde-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T11:54:25.541681(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of arlineka/Brunhilde-13b\n\n\n\nDataset automatically created during the evaluation run of model arlineka/Brunhilde-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T11:54:25.541681(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of arlineka/Brunhilde-13b\n\n\n\nDataset automatically created during the evaluation run of model arlineka/Brunhilde-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T11:54:25.541681(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
f2fc8767712f101a2523128ddb00b536d70396b1 |
# Dataset Card for Evaluation run of KatyTheCutie/EstopianMaid-13B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KatyTheCutie/EstopianMaid-13B](https://huggingface.co/KatyTheCutie/EstopianMaid-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KatyTheCutie__EstopianMaid-13B",
"harness_winogrande_5",
split="train")
```
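If you are unsure which of the 63 configurations to request, you can enumerate them first. A minimal sketch using the `datasets` library's config-listing helper:

```python
from datasets import get_dataset_config_names

# List every available configuration: one per evaluated task, plus "results".
configs = get_dataset_config_names("open-llm-leaderboard/details_KatyTheCutie__EstopianMaid-13B")
print(len(configs), configs[:5])
```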
## Latest results
These are the [latest results from run 2024-02-14T11:59:41.203334](https://huggingface.co/datasets/open-llm-leaderboard/details_KatyTheCutie__EstopianMaid-13B/blob/main/results_2024-02-14T11-59-41.203334.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5588896471810725,
"acc_stderr": 0.03358524149192356,
"acc_norm": 0.5671325395608912,
"acc_norm_stderr": 0.034363791698055104,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5235005454748325,
"mc2_stderr": 0.01582550300012819
},
"harness|arc:challenge|25": {
"acc": 0.5836177474402731,
"acc_stderr": 0.014405618279436178,
"acc_norm": 0.6049488054607508,
"acc_norm_stderr": 0.014285898292938169
},
"harness|hellaswag|10": {
"acc": 0.6406094403505278,
"acc_stderr": 0.004788412062375697,
"acc_norm": 0.8348934475204143,
"acc_norm_stderr": 0.0037051790292873302
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.030285009259009787,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.030285009259009787
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808777,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.032579014820998356,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.032579014820998356
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047736,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047736
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.02716253782694846,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.02716253782694846
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486518,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486518
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0368105086916155,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0368105086916155
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624526,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624526
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5102564102564102,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.5102564102564102,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7376146788990826,
"acc_stderr": 0.01886188502153473,
"acc_norm": 0.7376146788990826,
"acc_norm_stderr": 0.01886188502153473
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890488,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890488
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7522349936143039,
"acc_stderr": 0.015438083080568973,
"acc_norm": 0.7522349936143039,
"acc_norm_stderr": 0.015438083080568973
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016127,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016127
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47262569832402235,
"acc_stderr": 0.016697420650642752,
"acc_norm": 0.47262569832402235,
"acc_norm_stderr": 0.016697420650642752
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.02778014120702335,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.02778014120702335
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.02741799670563099,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.02741799670563099
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.0266756119260371,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.0266756119260371
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255855,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255855
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42242503259452413,
"acc_stderr": 0.012615600475734921,
"acc_norm": 0.42242503259452413,
"acc_norm_stderr": 0.012615600475734921
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5477941176470589,
"acc_stderr": 0.030233758551596445,
"acc_norm": 0.5477941176470589,
"acc_norm_stderr": 0.030233758551596445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5669934640522876,
"acc_stderr": 0.020045442473324224,
"acc_norm": 0.5669934640522876,
"acc_norm_stderr": 0.020045442473324224
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357303,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357303
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5235005454748325,
"mc2_stderr": 0.01582550300012819
},
"harness|winogrande|5": {
"acc": 0.755327545382794,
"acc_stderr": 0.012082125654159738
},
"harness|gsm8k|5": {
"acc": 0.09173616376042457,
"acc_stderr": 0.007950942148339338
}
}
```
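To retrieve these aggregated numbers programmatically rather than reading the JSON above, a minimal sketch using the "results" configuration and "latest" split described earlier in this card:

```python
from datasets import load_dataset

# The "results" configuration aggregates all task scores for a run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_KatyTheCutie__EstopianMaid-13B",
	"results",
	split="latest")
print(results[0].keys())  # inspect the fields holding the aggregated metrics
```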
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_KatyTheCutie__EstopianMaid-13B | [
"region:us"
] | 2024-02-14T12:01:58+00:00 | {"pretty_name": "Evaluation run of KatyTheCutie/EstopianMaid-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [KatyTheCutie/EstopianMaid-13B](https://huggingface.co/KatyTheCutie/EstopianMaid-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KatyTheCutie__EstopianMaid-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T11:59:41.203334](https://huggingface.co/datasets/open-llm-leaderboard/details_KatyTheCutie__EstopianMaid-13B/blob/main/results_2024-02-14T11-59-41.203334.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5588896471810725,\n \"acc_stderr\": 0.03358524149192356,\n \"acc_norm\": 0.5671325395608912,\n \"acc_norm_stderr\": 0.034363791698055104,\n \"mc1\": 0.3537331701346389,\n \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5235005454748325,\n \"mc2_stderr\": 0.01582550300012819\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5836177474402731,\n \"acc_stderr\": 0.014405618279436178,\n \"acc_norm\": 0.6049488054607508,\n \"acc_norm_stderr\": 0.014285898292938169\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6406094403505278,\n \"acc_stderr\": 0.004788412062375697,\n \"acc_norm\": 0.8348934475204143,\n \"acc_norm_stderr\": 0.0037051790292873302\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009787,\n \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009787\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n 
\"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808777,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808777\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.032579014820998356,\n \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.032579014820998356\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047736,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047736\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n \"acc_stderr\": 0.02716253782694846,\n \"acc_norm\": 0.6483870967741936,\n \"acc_norm_stderr\": 0.02716253782694846\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486518,\n \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486518\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0368105086916155,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0368105086916155\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\": 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624526,\n \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624526\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5102564102564102,\n \"acc_stderr\": 0.025345672221942374,\n \"acc_norm\": 0.5102564102564102,\n \"acc_norm_stderr\": 0.025345672221942374\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7376146788990826,\n \"acc_stderr\": 0.01886188502153473,\n \"acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.01886188502153473\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n \"acc_stderr\": 0.026246772946890488,\n \"acc_norm\": 0.7991452991452992,\n \"acc_norm_stderr\": 0.026246772946890488\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7522349936143039,\n \"acc_stderr\": 0.015438083080568973,\n \"acc_norm\": 0.7522349936143039,\n \"acc_norm_stderr\": 0.015438083080568973\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016127,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016127\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47262569832402235,\n \"acc_stderr\": 0.016697420650642752,\n \"acc_norm\": 0.47262569832402235,\n \"acc_norm_stderr\": 0.016697420650642752\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.02778014120702335,\n \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.02778014120702335\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n \"acc_stderr\": 0.02741799670563099,\n \"acc_norm\": 0.6302250803858521,\n \"acc_norm_stderr\": 0.02741799670563099\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.0266756119260371,\n \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.0266756119260371\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255855,\n \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255855\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42242503259452413,\n \"acc_stderr\": 0.012615600475734921,\n \"acc_norm\": 0.42242503259452413,\n \"acc_norm_stderr\": 0.012615600475734921\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.030233758551596445,\n \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.030233758551596445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324224,\n \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324224\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n \"acc_stderr\": 0.03096590312357303,\n \"acc_norm\": 0.7412935323383084,\n \"acc_norm_stderr\": 0.03096590312357303\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3537331701346389,\n \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5235005454748325,\n \"mc2_stderr\": 0.01582550300012819\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09173616376042457,\n \"acc_stderr\": 
0.007950942148339338\n }\n}\n```", "repo_url": "https://huggingface.co/KatyTheCutie/EstopianMaid-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|arc:challenge|25_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|gsm8k|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hellaswag|10_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-59-41.203334.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-59-41.203334.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-59-41.203334.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T11-59-41.203334.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-59-41.203334.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T11_59_41.203334", "path": ["**/details_harness|winogrande|5_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T11-59-41.203334.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_14T11_59_41.203334", "path": ["results_2024-02-14T11-59-41.203334.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T11-59-41.203334.parquet"]}]}]} | 2024-02-14T12:02:21+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of KatyTheCutie/EstopianMaid-13B
Dataset automatically created during the evaluation run of model KatyTheCutie/EstopianMaid-13B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
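For instance, a minimal sketch (the repository name below is inferred from the `details_<org>__<model>` naming convention used by the other Open LLM Leaderboard detail datasets, and `harness_winogrande_5` is one of the configurations listed in this card's metadata):

```python
from datasets import load_dataset

# Load the per-sample details of the winogrande task for the latest run
data = load_dataset("open-llm-leaderboard/details_KatyTheCutie__EstopianMaid-13B",
                    "harness_winogrande_5",
                    split="train")
```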
## Latest results
These are the latest results from run 2024-02-14T11:59:41.203334 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of KatyTheCutie/EstopianMaid-13B\n\n\n\nDataset automatically created during the evaluation run of model KatyTheCutie/EstopianMaid-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T11:59:41.203334(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of KatyTheCutie/EstopianMaid-13B\n\n\n\nDataset automatically created during the evaluation run of model KatyTheCutie/EstopianMaid-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T11:59:41.203334(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KatyTheCutie/EstopianMaid-13B\n\n\n\nDataset automatically created during the evaluation run of model KatyTheCutie/EstopianMaid-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T11:59:41.203334(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
6a47fb20992f5da3687432bd67862fceb0597c68 | generated by ChatGPT | CreitinGameplays/elisa-chan-v1.5 | [
"language:en",
"region:us"
] | 2024-02-14T12:08:24+00:00 | {"language": ["en"]} | 2024-02-16T12:30:16+00:00 | [] | [
"en"
] | TAGS
#language-English #region-us
| generated by ChatGPT | [] | [
"TAGS\n#language-English #region-us \n"
] | [
10
] | [
"passage: TAGS\n#language-English #region-us \n"
] |
fb6011288b8714de330054b826e50d9295025aa9 |
Cancer-free CSV | Tom9000/wikitext-csv | [
"license:apache-2.0",
"region:us"
] | 2024-02-14T12:10:11+00:00 | {"license": "apache-2.0"} | 2024-02-14T12:16:25+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
Cancer-free CSV | [] | [
"TAGS\n#license-apache-2.0 #region-us \n"
] | [
14
] | [
"passage: TAGS\n#license-apache-2.0 #region-us \n"
] |
ada2a12c50f1fb2bec7d2e2c8a986ffd9c5f561d | # Persian Daily News
# Dataset Summary
persian_daily_news is a collection of 2 million unique news articles, each paired with its headline. The dataset can be used for abstractive summarization and paraphrasing tasks.
This effort is part of a broader initiative to provide several datasets in the Persian language (and other low-resource languages) for different tasks, built around two important factors: `free` and `easy-to-use`. Here is a quick HOW-TO for using this dataset with the datasets library: [Demo-datasets](https://saied71.github.io/saied-alimoradi-blog/posts/2021-9-4-demo-datasets.html)
# Description
As discussed before, this dataset contains 2M news articles. Each article has two attributes: text and summary. Here is a sample from the dataset:
```
text: به گزارش گروه بین الملل ، خبرگزاری رسمی قطر اعلام کرد، بعد از امضای موافقتنامه همکاری نظامی بین قطر و روسیه این امکان فراهم شده است تا نظامیان قطری برای تکمیل آموزشهای نظامی خود عازم روسیه شده و در آنجا تعلیم ببینند.در چارچوب این قرارداد که امروز یک شنبه توسط سرتیپ ستاد عبدالعزیز صالح السلیطی رییس هییت همکاریهای بین المللی نظامی قطر و سرلشکر ویکتور جوریمیکین رییس اداره عمومی نیروی انسانی وزارت دفاع روسیه به امضا رسید، روابط نظامی بین دوحه و مسکو در زمینه موسسات آموزشهای نظامی شاهد توسه قابل توجهی خواهد شد.به نوشته این خبرگزاری روابط قطر و روسیه در حال گسترش بوده و به سوی شکلگیری مشارکت راهبردی در تمامی زمینهها پیش میرود.
summary: از این پس نظامیان قطری برای آموزش عازم روسیه شده و در موسسات آموزش نظامی این کشور تعلیم خواهند دید.
```
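A minimal loading sketch with the datasets library (the `train` split name is an assumption; the `text` and `summary` fields follow the sample above):

```python
from datasets import load_dataset

# Load persian_daily_news from the Hugging Face Hub
dataset = load_dataset("saied/persian_daily_news")

# Each record pairs a full article (`text`) with its headline (`summary`)
example = dataset["train"][0]
print(example["summary"])
```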
# Citation
```
[email protected]
title={persian_daily_news},
author={Saied Alimoradi},
year={2021}
}
```
| saied/persian_daily_news | [
"task_categories:summarization",
"task_categories:text-generation",
"source_datasets:original",
"language:fa",
"region:us"
] | 2024-02-14T12:23:51+00:00 | {"language": ["fa"], "source_datasets": ["original"], "task_categories": ["summarization", "text-generation"], "pretty_name": "Persian Daily News"} | 2024-02-14T12:29:21+00:00 | [] | [
"fa"
] | TAGS
#task_categories-summarization #task_categories-text-generation #source_datasets-original #language-Persian #region-us
| # Persian Daily News
# Dataset Summary
persian_daily_news is a collection of 2 million unique news articles, each paired with its headline. The dataset can be used for abstractive summarization and paraphrasing tasks.
This effort is part of a broader initiative to provide several datasets in the Persian language (and other low-resource languages) for different tasks, built around two important factors: 'free' and 'easy-to-use'. Here is a quick HOW-TO for using this dataset with the datasets library: Demo-datasets
# Description
As discussed before, this dataset contains 2M news articles. Each article has two attributes: text and summary. Here is a sample from the dataset:
| [
"# Persian Daily News",
"# Dataset Summary\n\npersian_daily_news is a collection of 2 million of unique news articles with the headline for each article. dataset can be used in abstractive summarization and paraphrasing tasks.\n\nThis effort is part of a bigger perspective to have several datasets in Persian language(and other low resources languages) for different tasks that have two important factors: 'free' and 'easy-to-use'. Here is a quick HOW-TO for using this dataset in datasets library:Demo-datasets",
"# Description\n\nAs discussed before, this dataset contains 2M news articles. Each article has these two attributes: text and summary. Here is a sample of dataset:"
] | [
"TAGS\n#task_categories-summarization #task_categories-text-generation #source_datasets-original #language-Persian #region-us \n",
"# Persian Daily News",
"# Dataset Summary\n\npersian_daily_news is a collection of 2 million of unique news articles with the headline for each article. dataset can be used in abstractive summarization and paraphrasing tasks.\n\nThis effort is part of a bigger perspective to have several datasets in Persian language(and other low resources languages) for different tasks that have two important factors: 'free' and 'easy-to-use'. Here is a quick HOW-TO for using this dataset in datasets library:Demo-datasets",
"# Description\n\nAs discussed before, this dataset contains 2M news articles. Each article has these two attributes: text and summary. Here is a sample of dataset:"
] | [
40,
5,
122,
37
] | [
"passage: TAGS\n#task_categories-summarization #task_categories-text-generation #source_datasets-original #language-Persian #region-us \n# Persian Daily News# Dataset Summary\n\npersian_daily_news is a collection of 2 million of unique news articles with the headline for each article. dataset can be used in abstractive summarization and paraphrasing tasks.\n\nThis effort is part of a bigger perspective to have several datasets in Persian language(and other low resources languages) for different tasks that have two important factors: 'free' and 'easy-to-use'. Here is a quick HOW-TO for using this dataset in datasets library:Demo-datasets# Description\n\nAs discussed before, this dataset contains 2M news articles. Each article has these two attributes: text and summary. Here is a sample of dataset:"
] |
55acb5be6ff59ee9041a86c2a9aff17003296016 |
# Dataset Card for Evaluation run of yleo/EmertonBeagle-7B-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yleo/EmertonBeagle-7B-dpo](https://huggingface.co/yleo/EmertonBeagle-7B-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yleo__EmertonBeagle-7B-dpo",
"harness_winogrande_5",
split="train")
```
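
The aggregated metrics mentioned above live in the additional `results` configuration; as a sketch (the `latest` split alias is an assumption, following the convention the other detail datasets use for their `results` configuration):

```python
from datasets import load_dataset

# Aggregated results of the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_yleo__EmertonBeagle-7B-dpo",
                       "results",
                       split="latest")
```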
## Latest results
These are the [latest results from run 2024-02-14T12:29:04.356881](https://huggingface.co/datasets/open-llm-leaderboard/details_yleo__EmertonBeagle-7B-dpo/blob/main/results_2024-02-14T12-29-04.356881.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6502849479955121,
"acc_stderr": 0.032124695800127875,
"acc_norm": 0.650291607002714,
"acc_norm_stderr": 0.03278751705025616,
"mc1": 0.598531211750306,
"mc1_stderr": 0.01716027390169366,
"mc2": 0.7595578654539383,
"mc2_stderr": 0.013995290002307544
},
"harness|arc:challenge|25": {
"acc": 0.7047781569965871,
"acc_stderr": 0.013329750293382316,
"acc_norm": 0.7278156996587031,
"acc_norm_stderr": 0.013006600406423702
},
"harness|hellaswag|10": {
"acc": 0.7143995220075682,
"acc_stderr": 0.004507768029590099,
"acc_norm": 0.8911571400119498,
"acc_norm_stderr": 0.0031080545633521083
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.0253795249107784,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.0253795249107784
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590172,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590172
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250437,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290902,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290902
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4134078212290503,
"acc_stderr": 0.016469814928406167,
"acc_norm": 0.4134078212290503,
"acc_norm_stderr": 0.016469814928406167
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4771838331160365,
"acc_stderr": 0.012756933382823694,
"acc_norm": 0.4771838331160365,
"acc_norm_stderr": 0.012756933382823694
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146293,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146293
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.598531211750306,
"mc1_stderr": 0.01716027390169366,
"mc2": 0.7595578654539383,
"mc2_stderr": 0.013995290002307544
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.01041084977522279
},
"harness|gsm8k|5": {
"acc": 0.6641394996209249,
"acc_stderr": 0.013009224714267357
}
}
```
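As a quick illustration of working with the numbers above (a hedged sketch: `results.json` is a hypothetical local copy of the JSON block shown here, not a file shipped with the dataset), the MMLU subtask accuracies can be ranked like this:

```python
import json

# Load a local copy of the results dict printed above (hypothetical filename).
with open("results.json") as f:
    results = json.load(f)

# Keys for MMLU subtasks follow the "harness|hendrycksTest-<subject>|5" pattern.
mmlu = {
    name: scores["acc"]
    for name, scores in results.items()
    if name.startswith("harness|hendrycksTest-")
}

# Print the five strongest subjects for this model.
for name, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{name}: {acc:.4f}")
```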
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
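Although the card template leaves this section unfilled, the per-task configurations and timestamped splits described above can be listed programmatically. A minimal sketch, assuming network access to the Hugging Face Hub:

```python
from datasets import get_dataset_config_names

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_yleo__EmertonBeagle-7B-dpo"
)
print(len(configs), configs[:5])
```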
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_yleo__EmertonBeagle-7B-dpo | [
"region:us"
] | 2024-02-14T12:31:20+00:00 | {"pretty_name": "Evaluation run of yleo/EmertonBeagle-7B-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [yleo/EmertonBeagle-7B-dpo](https://huggingface.co/yleo/EmertonBeagle-7B-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yleo__EmertonBeagle-7B-dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T12:29:04.356881](https://huggingface.co/datasets/open-llm-leaderboard/details_yleo__EmertonBeagle-7B-dpo/blob/main/results_2024-02-14T12-29-04.356881.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6502849479955121,\n \"acc_stderr\": 0.032124695800127875,\n \"acc_norm\": 0.650291607002714,\n \"acc_norm_stderr\": 0.03278751705025616,\n \"mc1\": 0.598531211750306,\n \"mc1_stderr\": 0.01716027390169366,\n \"mc2\": 0.7595578654539383,\n \"mc2_stderr\": 0.013995290002307544\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7047781569965871,\n \"acc_stderr\": 0.013329750293382316,\n \"acc_norm\": 0.7278156996587031,\n \"acc_norm_stderr\": 0.013006600406423702\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7143995220075682,\n \"acc_stderr\": 0.004507768029590099,\n \"acc_norm\": 0.8911571400119498,\n \"acc_norm_stderr\": 0.0031080545633521083\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.0253795249107784,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.0253795249107784\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590172,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590172\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250437,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250437\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 
0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4134078212290503,\n \"acc_stderr\": 0.016469814928406167,\n \"acc_norm\": 0.4134078212290503,\n \"acc_norm_stderr\": 0.016469814928406167\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4771838331160365,\n \"acc_stderr\": 0.012756933382823694,\n \"acc_norm\": 0.4771838331160365,\n \"acc_norm_stderr\": 0.012756933382823694\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146293,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146293\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.598531211750306,\n \"mc1_stderr\": 0.01716027390169366,\n \"mc2\": 0.7595578654539383,\n \"mc2_stderr\": 0.013995290002307544\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.01041084977522279\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6641394996209249,\n \"acc_stderr\": 0.013009224714267357\n }\n}\n```", "repo_url": "https://huggingface.co/yleo/EmertonBeagle-7B-dpo", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|arc:challenge|25_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|gsm8k|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hellaswag|10_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T12-29-04.356881.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T12-29-04.356881.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T12-29-04.356881.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T12-29-04.356881.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T12-29-04.356881.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T12-29-04.356881.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["**/details_harness|winogrande|5_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T12-29-04.356881.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T12_29_04.356881", "path": ["results_2024-02-14T12-29-04.356881.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T12-29-04.356881.parquet"]}]}]} | 2024-02-14T12:31:47+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of yleo/EmertonBeagle-7B-dpo
Dataset automatically created during the evaluation run of model yleo/EmertonBeagle-7B-dpo on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
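A minimal sketch, following the loading convention this card family uses elsewhere; the repository id and the `harness_winogrande_5` configuration name are taken from this card's own metadata:

```python
from datasets import load_dataset

# Per-example details for one task configuration of this run.
data = load_dataset(
    "open-llm-leaderboard/details_yleo__EmertonBeagle-7B-dpo",
    "harness_winogrande_5",
    split="train",
)
```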
## Latest results
These are the latest results from run 2024-02-14T12:29:04.356881 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
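For the aggregated metrics themselves, a sketch along the same lines should work; the "results" configuration name and the "latest"/timestamped split names below are taken from this card's metadata:

```python
from datasets import load_dataset

# Aggregated metrics for this run; "latest" always points at the newest results.
results = load_dataset(
    "open-llm-leaderboard/details_yleo__EmertonBeagle-7B-dpo",
    "results",
    split="latest",  # or the timestamped split "2024_02_14T12_29_04.356881"
)
```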
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of yleo/EmertonBeagle-7B-dpo\n\n\n\nDataset automatically created during the evaluation run of model yleo/EmertonBeagle-7B-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T12:29:04.356881(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of yleo/EmertonBeagle-7B-dpo\n\n\n\nDataset automatically created during the evaluation run of model yleo/EmertonBeagle-7B-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T12:29:04.356881(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of yleo/EmertonBeagle-7B-dpo\n\n\n\nDataset automatically created during the evaluation run of model yleo/EmertonBeagle-7B-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T12:29:04.356881(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
a6757f2ab599ca49808f1c814defde33ceb8b8d2 |
# Dataset Card for Evaluation run of NovoCode/Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NovoCode/Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO](https://huggingface.co/NovoCode/Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NovoCode__Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO",
"harness_winogrande_5",
split="train")
```
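Once loaded, `data` behaves like any other `datasets.Dataset`. Below is a minimal sketch of how you might inspect it; column names vary from task to task, so treat the printed schema as the source of truth, and note that the `"results"` configuration with a `"latest"` split is an assumption based on the split-naming convention described above.

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_NovoCode__Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO"

# Per-example details for one task; the available columns differ per task.
details = load_dataset(repo, "harness_winogrande_5", split="train")
print(details.column_names)  # e.g. prompts, targets, per-choice scores
print(details[0])

# Convert to pandas for ad-hoc analysis.
df = details.to_pandas()
print(df.head())

# The aggregated metrics live in the "results" configuration; "latest"
# mirrors the timestamped split of the most recent run (assumed from the
# split-naming convention used throughout this repo).
results = load_dataset(repo, "results", split="latest")
print(results[0])
```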
## Latest results
These are the [latest results from run 2024-02-14T12:51:27.185678](https://huggingface.co/datasets/open-llm-leaderboard/details_NovoCode__Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO/blob/main/results_2024-02-14T12-51-27.185678.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6108305478672853,
"acc_stderr": 0.032907750893401165,
"acc_norm": 0.6160620477308243,
"acc_norm_stderr": 0.03358458368088749,
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144905,
"mc2": 0.4815580541557307,
"mc2_stderr": 0.014472367088943533
},
"harness|arc:challenge|25": {
"acc": 0.3796928327645051,
"acc_stderr": 0.014182119866974876,
"acc_norm": 0.39419795221843,
"acc_norm_stderr": 0.014280522667467323
},
"harness|hellaswag|10": {
"acc": 0.6296554471220872,
"acc_stderr": 0.00481910045686781,
"acc_norm": 0.8258315076677952,
"acc_norm_stderr": 0.0037847921724660683
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153303,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153303
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.02460362692409742,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.02460362692409742
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.016722684526200148,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.016722684526200148
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406953,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406953
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876163,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876163
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.025416003773165545,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.025416003773165545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4145251396648045,
"acc_stderr": 0.016476342210253996,
"acc_norm": 0.4145251396648045,
"acc_norm_stderr": 0.016476342210253996
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.02664327847450875,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.02664327847450875
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900922,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42959582790091266,
"acc_stderr": 0.012643004623790206,
"acc_norm": 0.42959582790091266,
"acc_norm_stderr": 0.012643004623790206
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854125,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854125
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252089,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252089
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.030021056238440313,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.030021056238440313
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421606,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421606
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.02954774168764004,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.02954774168764004
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144905,
"mc2": 0.4815580541557307,
"mc2_stderr": 0.014472367088943533
},
"harness|winogrande|5": {
"acc": 0.7719021310181531,
"acc_stderr": 0.0117930158176636
},
"harness|gsm8k|5": {
"acc": 0.35178165276724793,
"acc_stderr": 0.013153446023536035
}
}
```
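If you prefer the raw aggregate file over the `datasets` API, the JSON linked above can be fetched directly from the repo. A minimal sketch follows, assuming the file either is, or nests under a `"results"` key, the dictionary printed here:

```python
import json

from huggingface_hub import hf_hub_download

# Filename taken from the "Latest results" link above; change it to read a
# different run's file.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_NovoCode__Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO",
    filename="results_2024-02-14T12-51-27.185678.json",
    repo_type="dataset",
)
with open(path) as f:
    payload = json.load(f)

# Some harness dumps nest the metrics under a "results" key; fall back to
# the top level otherwise (an assumption, not guaranteed by this card).
metrics = payload.get("results", payload)
print(metrics["all"]["acc"], metrics["all"]["acc_norm"])

# Rank the MMLU subtasks by normalized accuracy.
mmlu = {
    task: scores["acc_norm"]
    for task, scores in metrics.items()
    if task.startswith("harness|hendrycksTest")
}
for task, score in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{task}: {score:.3f}")
```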
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_NovoCode__Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO | [
"region:us"
] | 2024-02-14T12:53:44+00:00 | {"pretty_name": "Evaluation run of NovoCode/Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [NovoCode/Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO](https://huggingface.co/NovoCode/Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NovoCode__Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T12:51:27.185678](https://huggingface.co/datasets/open-llm-leaderboard/details_NovoCode__Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO/blob/main/results_2024-02-14T12-51-27.185678.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6108305478672853,\n \"acc_stderr\": 0.032907750893401165,\n \"acc_norm\": 0.6160620477308243,\n \"acc_norm_stderr\": 0.03358458368088749,\n \"mc1\": 0.30966952264381886,\n \"mc1_stderr\": 0.016185744355144905,\n \"mc2\": 0.4815580541557307,\n \"mc2_stderr\": 0.014472367088943533\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3796928327645051,\n \"acc_stderr\": 0.014182119866974876,\n \"acc_norm\": 0.39419795221843,\n \"acc_norm_stderr\": 0.014280522667467323\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6296554471220872,\n \"acc_stderr\": 0.00481910045686781,\n \"acc_norm\": 0.8258315076677952,\n \"acc_norm_stderr\": 0.0037847921724660683\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395269,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395269\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n 
\"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.032671518489247764,\n \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.032671518489247764\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153303,\n 
\"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153303\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.02460362692409742,\n \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.02460362692409742\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200148,\n \"acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200148\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406953,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406953\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 
0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.013890862162876163,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.013890862162876163\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165545,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165545\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4145251396648045,\n \"acc_stderr\": 0.016476342210253996,\n \"acc_norm\": 0.4145251396648045,\n \"acc_norm_stderr\": 0.016476342210253996\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.02664327847450875,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.02664327847450875\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900922,\n \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900922\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42959582790091266,\n \"acc_stderr\": 0.012643004623790206,\n \"acc_norm\": 0.42959582790091266,\n \"acc_norm_stderr\": 0.012643004623790206\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854125,\n \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854125\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252089,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252089\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.030021056238440313,\n \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.030021056238440313\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421606,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421606\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764004,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764004\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n \"mc1_stderr\": 0.016185744355144905,\n \"mc2\": 0.4815580541557307,\n \"mc2_stderr\": 0.014472367088943533\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.0117930158176636\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.35178165276724793,\n \"acc_stderr\": 0.013153446023536035\n }\n}\n```", "repo_url": "https://huggingface.co/NovoCode/Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|arc:challenge|25_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|gsm8k|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hellaswag|10_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T12-51-27.185678.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T12-51-27.185678.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T12-51-27.185678.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T12-51-27.185678.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T12-51-27.185678.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["**/details_harness|winogrande|5_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-14T12-51-27.185678.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T12_51_27.185678", "path": ["results_2024-02-14T12-51-27.185678.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T12-51-27.185678.parquet"]}]}]} | 2024-02-14T12:54:07+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of NovoCode/Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO
Dataset automatically created during the evaluation run of model NovoCode/Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
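A minimal sketch is shown below; the repository id follows the `details_<org>__<model>` pattern used by the other evaluation cards in this document and is assumed, not verified, for this model:

```python
from datasets import load_dataset

# Assumed repo id, built from the org/model name per the pattern used elsewhere
data = load_dataset(
    "open-llm-leaderboard/details_NovoCode__Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO",
    "harness_winogrande_5",  # any of the 63 configs works; winogrande is just an example
    split="train",
)
```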
## Latest results
These are the latest results from run 2024-02-14T12:51:27.185678 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of NovoCode/Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO\n\n\n\nDataset automatically created during the evaluation run of model NovoCode/Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T12:51:27.185678(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NovoCode/Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO\n\n\n\nDataset automatically created during the evaluation run of model NovoCode/Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T12:51:27.185678(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
213,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of NovoCode/Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO\n\n\n\nDataset automatically created during the evaluation run of model NovoCode/Tiger-7B-v0.1-LaserRMT-Math-5-10-15-Neural-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T12:51:27.185678(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:"
] |
79ec83c3f9e2fbfa8c3881845d80bffbbdbd4251 |
# **SINHALA QUESTION AND ANSWER DATASET by Indramal**
> **Contact details:** [Indramal Wansekara Profile Website](https://www.indramal.com/) | Indramal/SINHALA_QUESTION_AND_ANSWER_DATASET | [
"task_categories:question-answering",
"language:si",
"license:apache-2.0",
"sinhala",
"general",
"question&answering",
"region:us"
] | 2024-02-14T13:06:39+00:00 | {"language": ["si"], "license": "apache-2.0", "task_categories": ["question-answering"], "tags": ["sinhala", "general", "question&answering"]} | 2024-02-14T14:10:18+00:00 | [] | [
"si"
] | TAGS
#task_categories-question-answering #language-Sinhala #license-apache-2.0 #sinhala #general #question&answering #region-us
|
# SINHALA QUESTION AND ANSWER DATASET by Indramal
> Contact details: Indramal Wansekara Profile Website | [
"# SINHALA QUESTION AND ANSWER DATASET by Indramal\n\n> Contact details: Indramal Wansekara Profile Website"
] | [
"TAGS\n#task_categories-question-answering #language-Sinhala #license-apache-2.0 #sinhala #general #question&answering #region-us \n",
"# SINHALA QUESTION AND ANSWER DATASET by Indramal\n\n> Contact details: Indramal Wansekara Profile Website"
] | [
43,
28
] | [
"passage: TAGS\n#task_categories-question-answering #language-Sinhala #license-apache-2.0 #sinhala #general #question&answering #region-us \n# SINHALA QUESTION AND ANSWER DATASET by Indramal\n\n> Contact details: Indramal Wansekara Profile Website"
] |
0bdc91018d131fc4fcf746a2ee190db90ffb03e0 |
# Dataset Card for Dataset Name
This dataset has been meticulously curated by the AI team at IEEE Student Branch, Vishwakarma Institute of Technology (VIT) Pune, with the explicit purpose of training the Llama2 model. It encompasses a diverse range of topics essential for the development of an effective conversational AI system.
## Dataset Details
### Dataset Description
The dataset comprises a comprehensive selection of topics, including but not limited to:
- Frequently Asked Questions (FAQs) related to IEEE Student Branch at VIT Pune.
- Inquiries pertaining to placements, encompassing strategies, tips, and common queries.
- Questions related to fundamental concepts in Data Structures and Algorithms.
- Queries and discussions regarding research papers, methodologies, and academic pursuits.
- **Curated by:** AI Team- IEEE SB VIT Pune
## Uses
This dataset was designed primarily for a chatbot for IEEE SB VIT Pune, so that university students could use it for their own benefit; however, it also includes some general topics related to Research Papers, Data Structures and Algorithms, and Placements that can be used by others for their custom chatbots.
## Dataset Structure
The dataset consists of the following fields:
- **Instruction:** This field represents the prompt or query posed to the chatbot.
- **Response:** This field contains the corresponding response generated by the chatbot.
## Dataset Structure Information
The dataset is structured in a JSON format, with each entry containing the following fields:
```json
{
"instruction": "What is IEEE?",
"response": "The IEEE or Institute of Electrical and Electronics Engineers is the world's largest professional technical organization dedicated to the advancement of technology for the benefit of humanity."
}
```
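A minimal loading sketch, assuming the `Question`/`Answer` column names and the single `train` split declared in the dataset metadata (note that the metadata lists `Question`/`Answer`, while the JSON example above uses `instruction`/`response`):

```python
from datasets import load_dataset

# Single "train" split per the dataset metadata
ds = load_dataset("IEEEVITPune-AI-Team/chatbotAlpha", split="train")

# Columns are named "Question" and "Answer" in the repository metadata
print(ds[0]["Question"])
print(ds[0]["Answer"])
```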
### Curation Rationale
The motivation behind curating this dataset stems from a genuine desire to empower and support university students pursuing B.Tech degrees. Recognizing the pivotal role that IEEE Student Branch at Vishwakarma Institute of Technology (VIT) Pune plays in students' academic journeys, the aim was to create a resource that elucidates the myriad ways in which IEEE SB VIT Pune can enrich and enhance students' educational experiences.
At its core, this dataset is a testament to the commitment of the AI team at IEEE SB VIT Pune to empower B.Tech students with valuable insights and resources. By curating a comprehensive collection of topics spanning FAQs, placement strategies, technical concepts, and research discussions, the dataset seeks to equip students with the knowledge and understanding necessary to navigate their academic pursuits effectively.
## Dataset Card Authors
<br>AI Team- IEEE SB VIT Pune
<br>Mrunmayee Phadke (Project Head)
<br>Hritesh Maikap
<br>Nidhish
<br>Arya Lokhande
<br>Apurva Kota
<br>Soham Nimale
| IEEEVITPune-AI-Team/chatbotAlpha | [
"region:us"
] | 2024-02-14T13:11:31+00:00 | {"dataset_info": {"features": [{"name": "Question", "dtype": "string"}, {"name": "Answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2101813, "num_examples": 5526}], "download_size": 821355, "dataset_size": 2101813}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-14T16:01:56+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
This dataset has been meticulously curated by the AI team at IEEE Student Branch, Vishwakarma Institute of Technology (VIT) Pune, with the explicit purpose of training the Llama2 model. It encompasses a diverse range of topics essential for the development of an effective conversational AI system.
## Dataset Details
### Dataset Description
The dataset comprises a comprehensive selection of topics, including but not limited to:
- Frequently Asked Questions (FAQs) related to IEEE Student Branch at VIT Pune.
- Inquiries pertaining to placements, encompassing strategies, tips, and common queries.
- Questions related to fundamental concepts in Data Structures and Algorithms.
- Queries and discussions regarding research papers, methodologies, and academic pursuits.
- Curated by: AI Team- IEEE SB VIT Pune
## Uses
This dataset was designed primarily for a chatbot for IEEE SB VIT Pune, so that university students could use it for their own benefit; however, it also includes some general topics related to Research Papers, Data Structures and Algorithms, and Placements that can be used by others for their custom chatbots.
## Dataset Structure
The dataset consists of the following fields:
- Instruction: This field represents the prompt or query posed to the chatbot.
- Response: This field contains the corresponding response generated by the chatbot.
## Dataset Structure Information
The dataset is structured in a JSON format, with each entry containing the following fields:
### Curation Rationale
The motivation behind curating this dataset stems from a genuine desire to empower and support university students pursuing B.Tech degrees. Recognizing the pivotal role that IEEE Student Branch at Vishwakarma Institute of Technology (VIT) Pune plays in students' academic journeys, the aim was to create a resource that elucidates the myriad ways in which IEEE SB VIT Pune can enrich and enhance students' educational experiences.
At its core, this dataset is a testament to the commitment of the AI team at IEEE SB VIT Pune to empower B.Tech students with valuable insights and resources. By curating a comprehensive collection of topics spanning FAQs, placement strategies, technical concepts, and research discussions, the dataset seeks to equip students with the knowledge and understanding necessary to navigate their academic pursuits effectively.
## Dataset Card Authors
<br>AI Team- IEEE SB VIT Pune
<br>Mrunmayee Phadke (Project Head)
<br>Hritesh Maikap
<br>Nidhish
<br>Arya Lokhande
<br>Apurva Kota
<br>Soham Nimale
| [
"# Dataset Card for Dataset Name\n\nThis dataset has been meticulously curated by the AI team at IEEE Student Branch, Vishwakarma Institute of Technology (VIT) Pune, with the explicit purpose of training the Llama2 model. It encompasses a diverse range of topics essential for the development of an effective conversational AI system.",
"## Dataset Details",
"### Dataset Description\nThe dataset comprises a comprehensive selection of topics, including but not limited to:\n\nFrequently Asked Questions (FAQs) related to IEEE Student Branch at VIT Pune.\nInquiries pertaining to placements, encompassing strategies, tips, and common queries.\nQuestions related to fundamental concepts in Data Structures and Algorithms.\nQueries and discussions regarding research papers, methodologies, and academic pursuits.\n\n- Curated by: AI Team- IEEE SB VIT Pune",
"## Uses\n\nThis data was particularly designed for a chatbot for IEEE SB VIT Pune so that university students could use it for their own benifits, but it includes some general topics related to Research Papers, Data Structure and Algorithms and Placements that can be used by others for their custom chatbot",
"## Dataset Structure\n\nThe dataset consists of the following fields:\n\n- Instruction: This field represents the prompt or query posed to the chatbot.\n- Response: This field contains the corresponding generated response by the chatbot.",
"## Dataset Structure Information\n\nThe dataset is structured in a JSON format, with each entry containing the following fields:",
"### Curation Rationale\n\nThe motivation behind curating this dataset stems from a genuine desire to empower and support university students pursuing B.Tech degrees. Recognizing the pivotal role that IEEE Student Branch at Vishwakarma Institute of Technology (VIT) Pune plays in students' academic journeys, the aim was to create a resource that elucidates the myriad ways in which IEEE SB VIT Pune can enrich and enhance students' educational experiences.\nAt its core, this dataset is a testament to the commitment of the AI team at IEEE SB VIT Pune to empower B.Tech students with valuable insights and resources. By curating a comprehensive collection of topics spanning FAQs, placement strategies, technical concepts, and research discussions, the dataset seeks to equip students with the knowledge and understanding necessary to navigate their academic pursuits effectively.",
"## Dataset Card Authors\n<br>AI Team- IEEE SB VIT Pune\n<br>Mrunmayee Phadke (Project Head)\n<br>Hritesh Maikap\n<br>Nidhish\n<br>Arya Lokhande\n<br>Apurva Kota\n<br>Soham Nimale"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\nThis dataset has been meticulously curated by the AI team at IEEE Student Branch, Vishwakarma Institute of Technology (VIT) Pune, with the explicit purpose of training the Llama2 model. It encompasses a diverse range of topics essential for the development of an effective conversational AI system.",
"## Dataset Details",
"### Dataset Description\nThe dataset comprises a comprehensive selection of topics, including but not limited to:\n\nFrequently Asked Questions (FAQs) related to IEEE Student Branch at VIT Pune.\nInquiries pertaining to placements, encompassing strategies, tips, and common queries.\nQuestions related to fundamental concepts in Data Structures and Algorithms.\nQueries and discussions regarding research papers, methodologies, and academic pursuits.\n\n- Curated by: AI Team- IEEE SB VIT Pune",
"## Uses\n\nThis data was particularly designed for a chatbot for IEEE SB VIT Pune so that university students could use it for their own benifits, but it includes some general topics related to Research Papers, Data Structure and Algorithms and Placements that can be used by others for their custom chatbot",
"## Dataset Structure\n\nThe dataset consists of the following fields:\n\n- Instruction: This field represents the prompt or query posed to the chatbot.\n- Response: This field contains the corresponding generated response by the chatbot.",
"## Dataset Structure Information\n\nThe dataset is structured in a JSON format, with each entry containing the following fields:",
"### Curation Rationale\n\nThe motivation behind curating this dataset stems from a genuine desire to empower and support university students pursuing B.Tech degrees. Recognizing the pivotal role that IEEE Student Branch at Vishwakarma Institute of Technology (VIT) Pune plays in students' academic journeys, the aim was to create a resource that elucidates the myriad ways in which IEEE SB VIT Pune can enrich and enhance students' educational experiences.\nAt its core, this dataset is a testament to the commitment of the AI team at IEEE SB VIT Pune to empower B.Tech students with valuable insights and resources. By curating a comprehensive collection of topics spanning FAQs, placement strategies, technical concepts, and research discussions, the dataset seeks to equip students with the knowledge and understanding necessary to navigate their academic pursuits effectively.",
"## Dataset Card Authors\n<br>AI Team- IEEE SB VIT Pune\n<br>Mrunmayee Phadke (Project Head)\n<br>Hritesh Maikap\n<br>Nidhish\n<br>Arya Lokhande\n<br>Apurva Kota\n<br>Soham Nimale"
] | [
6,
76,
4,
123,
69,
54,
29,
195,
68
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Dataset Name\n\nThis dataset has been meticulously curated by the AI team at IEEE Student Branch, Vishwakarma Institute of Technology (VIT) Pune, with the explicit purpose of training the Llama2 model. It encompasses a diverse range of topics essential for the development of an effective conversational AI system.## Dataset Details### Dataset Description\nThe dataset comprises a comprehensive selection of topics, including but not limited to:\n\nFrequently Asked Questions (FAQs) related to IEEE Student Branch at VIT Pune.\nInquiries pertaining to placements, encompassing strategies, tips, and common queries.\nQuestions related to fundamental concepts in Data Structures and Algorithms.\nQueries and discussions regarding research papers, methodologies, and academic pursuits.\n\n- Curated by: AI Team- IEEE SB VIT Pune## Uses\n\nThis data was particularly designed for a chatbot for IEEE SB VIT Pune so that university students could use it for their own benifits, but it includes some general topics related to Research Papers, Data Structure and Algorithms and Placements that can be used by others for their custom chatbot## Dataset Structure\n\nThe dataset consists of the following fields:\n\n- Instruction: This field represents the prompt or query posed to the chatbot.\n- Response: This field contains the corresponding generated response by the chatbot.## Dataset Structure Information\n\nThe dataset is structured in a JSON format, with each entry containing the following fields:"
] |
e62b800813e65822810f5729523726c4bd8b07e4 |
# Dataset Card for Evaluation run of tyson0420/stack_codellama-7b-inst
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [tyson0420/stack_codellama-7b-inst](https://huggingface.co/tyson0420/stack_codellama-7b-inst) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tyson0420__stack_codellama-7b-inst",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-14T13:18:15.759578](https://huggingface.co/datasets/open-llm-leaderboard/details_tyson0420__stack_codellama-7b-inst/blob/main/results_2024-02-14T13-18-15.759578.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.39779725857115605,
"acc_stderr": 0.034197794443939396,
"acc_norm": 0.4011078418565935,
"acc_norm_stderr": 0.03495973121827962,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752323,
"mc2": 0.39026734427166393,
"mc2_stderr": 0.01459951299615118
},
"harness|arc:challenge|25": {
"acc": 0.39078498293515357,
"acc_stderr": 0.01425856388051378,
"acc_norm": 0.4351535836177474,
"acc_norm_stderr": 0.014487986197186041
},
"harness|hellaswag|10": {
"acc": 0.49123680541724757,
"acc_stderr": 0.004989014986235631,
"acc_norm": 0.6617207727544314,
"acc_norm_stderr": 0.0047215714433544095
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3815789473684211,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.3815789473684211,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3849056603773585,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.3849056603773585,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3680555555555556,
"acc_stderr": 0.04032999053960719,
"acc_norm": 0.3680555555555556,
"acc_norm_stderr": 0.04032999053960719
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.31213872832369943,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.31213872832369943,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39148936170212767,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.39148936170212767,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2830687830687831,
"acc_stderr": 0.023201392938194974,
"acc_norm": 0.2830687830687831,
"acc_norm_stderr": 0.023201392938194974
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604675,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604675
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.41935483870967744,
"acc_stderr": 0.02807158890109185,
"acc_norm": 0.41935483870967744,
"acc_norm_stderr": 0.02807158890109185
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.031089826002937533,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.031089826002937533
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.503030303030303,
"acc_stderr": 0.03904272341431857,
"acc_norm": 0.503030303030303,
"acc_norm_stderr": 0.03904272341431857
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.48484848484848486,
"acc_stderr": 0.03560716516531061,
"acc_norm": 0.48484848484848486,
"acc_norm_stderr": 0.03560716516531061
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.46632124352331605,
"acc_stderr": 0.03600244069867178,
"acc_norm": 0.46632124352331605,
"acc_norm_stderr": 0.03600244069867178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.33076923076923076,
"acc_stderr": 0.02385479568097114,
"acc_norm": 0.33076923076923076,
"acc_norm_stderr": 0.02385479568097114
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3067226890756303,
"acc_stderr": 0.029953823891887037,
"acc_norm": 0.3067226890756303,
"acc_norm_stderr": 0.029953823891887037
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5247706422018349,
"acc_stderr": 0.021410999753635914,
"acc_norm": 0.5247706422018349,
"acc_norm_stderr": 0.021410999753635914
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3101851851851852,
"acc_stderr": 0.031546962856566295,
"acc_norm": 0.3101851851851852,
"acc_norm_stderr": 0.031546962856566295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.47549019607843135,
"acc_stderr": 0.035050931943487976,
"acc_norm": 0.47549019607843135,
"acc_norm_stderr": 0.035050931943487976
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5443037974683544,
"acc_stderr": 0.03241920684693334,
"acc_norm": 0.5443037974683544,
"acc_norm_stderr": 0.03241920684693334
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4663677130044843,
"acc_stderr": 0.033481800170603065,
"acc_norm": 0.4663677130044843,
"acc_norm_stderr": 0.033481800170603065
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4351145038167939,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.4351145038167939,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5289256198347108,
"acc_stderr": 0.04556710331269498,
"acc_norm": 0.5289256198347108,
"acc_norm_stderr": 0.04556710331269498
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04803752235190193,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04803752235190193
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.38650306748466257,
"acc_stderr": 0.038258255488486076,
"acc_norm": 0.38650306748466257,
"acc_norm_stderr": 0.038258255488486076
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.688034188034188,
"acc_stderr": 0.03035152732334494,
"acc_norm": 0.688034188034188,
"acc_norm_stderr": 0.03035152732334494
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5261813537675607,
"acc_stderr": 0.01785543455404199,
"acc_norm": 0.5261813537675607,
"acc_norm_stderr": 0.01785543455404199
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.026483392042098187,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.026483392042098187
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2569832402234637,
"acc_stderr": 0.01461446582196634,
"acc_norm": 0.2569832402234637,
"acc_norm_stderr": 0.01461446582196634
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.41830065359477125,
"acc_stderr": 0.02824513402438729,
"acc_norm": 0.41830065359477125,
"acc_norm_stderr": 0.02824513402438729
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4662379421221865,
"acc_stderr": 0.028333277109562786,
"acc_norm": 0.4662379421221865,
"acc_norm_stderr": 0.028333277109562786
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4506172839506173,
"acc_stderr": 0.0276847214156562,
"acc_norm": 0.4506172839506173,
"acc_norm_stderr": 0.0276847214156562
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32269503546099293,
"acc_stderr": 0.027889139300534785,
"acc_norm": 0.32269503546099293,
"acc_norm_stderr": 0.027889139300534785
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.32529335071707954,
"acc_stderr": 0.011965311536571528,
"acc_norm": 0.32529335071707954,
"acc_norm_stderr": 0.011965311536571528
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.25,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.25,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.37745098039215685,
"acc_stderr": 0.019610851474880286,
"acc_norm": 0.37745098039215685,
"acc_norm_stderr": 0.019610851474880286
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.04769300568972743,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.04769300568972743
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3877551020408163,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.3877551020408163,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.44776119402985076,
"acc_stderr": 0.03516184772952167,
"acc_norm": 0.44776119402985076,
"acc_norm_stderr": 0.03516184772952167
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.038342347441649924,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.038342347441649924
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752323,
"mc2": 0.39026734427166393,
"mc2_stderr": 0.01459951299615118
},
"harness|winogrande|5": {
"acc": 0.6566692975532754,
"acc_stderr": 0.013344823185357998
},
"harness|gsm8k|5": {
"acc": 0.15845337376800606,
"acc_stderr": 0.010058474790238966
}
}
```
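Beyond the per-task configs, the card metadata also declares a `results` config whose `latest` split points at the aggregated parquet file. A hedged sketch for loading it (the parquet schema is assumed, not verified, to mirror the JSON summary above):

```python
from datasets import load_dataset

# "results" config and "latest" split are declared in this card's metadata
res = load_dataset(
    "open-llm-leaderboard/details_tyson0420__stack_codellama-7b-inst",
    "results",
    split="latest",
)
print(res)  # inspect the schema; assumed to mirror the JSON summary above
```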
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_tyson0420__stack_codellama-7b-inst | [
"region:us"
] | 2024-02-14T13:20:35+00:00 | {"pretty_name": "Evaluation run of tyson0420/stack_codellama-7b-inst", "dataset_summary": "Dataset automatically created during the evaluation run of model [tyson0420/stack_codellama-7b-inst](https://huggingface.co/tyson0420/stack_codellama-7b-inst) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tyson0420__stack_codellama-7b-inst\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T13:18:15.759578](https://huggingface.co/datasets/open-llm-leaderboard/details_tyson0420__stack_codellama-7b-inst/blob/main/results_2024-02-14T13-18-15.759578.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.39779725857115605,\n \"acc_stderr\": 0.034197794443939396,\n \"acc_norm\": 0.4011078418565935,\n \"acc_norm_stderr\": 0.03495973121827962,\n \"mc1\": 0.24112607099143207,\n \"mc1_stderr\": 0.014974827279752323,\n \"mc2\": 0.39026734427166393,\n \"mc2_stderr\": 0.01459951299615118\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.39078498293515357,\n \"acc_stderr\": 0.01425856388051378,\n \"acc_norm\": 0.4351535836177474,\n \"acc_norm_stderr\": 0.014487986197186041\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.49123680541724757,\n \"acc_stderr\": 0.004989014986235631,\n \"acc_norm\": 0.6617207727544314,\n \"acc_norm_stderr\": 0.0047215714433544095\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3815789473684211,\n \"acc_stderr\": 0.03953173377749194,\n \"acc_norm\": 0.3815789473684211,\n \"acc_norm_stderr\": 0.03953173377749194\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.3849056603773585,\n \"acc_stderr\": 0.02994649856769995,\n \"acc_norm\": 0.3849056603773585,\n \"acc_norm_stderr\": 0.02994649856769995\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n \"acc_stderr\": 0.04032999053960719,\n \"acc_norm\": 0.3680555555555556,\n \"acc_norm_stderr\": 0.04032999053960719\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 
0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.31213872832369943,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.31213872832369943,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.03190701242326812,\n \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.03190701242326812\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194974,\n \"acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194974\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.03893259610604675,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.03893259610604675\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.41935483870967744,\n \"acc_stderr\": 0.02807158890109185,\n \"acc_norm\": 0.41935483870967744,\n \"acc_norm_stderr\": 0.02807158890109185\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.031089826002937533,\n \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.031089826002937533\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.503030303030303,\n \"acc_stderr\": 0.03904272341431857,\n \"acc_norm\": 0.503030303030303,\n \"acc_norm_stderr\": 0.03904272341431857\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.48484848484848486,\n \"acc_stderr\": 0.03560716516531061,\n \"acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.03560716516531061\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.46632124352331605,\n \"acc_stderr\": 0.03600244069867178,\n \"acc_norm\": 0.46632124352331605,\n \"acc_norm_stderr\": 0.03600244069867178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.33076923076923076,\n \"acc_stderr\": 0.02385479568097114,\n \"acc_norm\": 0.33076923076923076,\n \"acc_norm_stderr\": 0.02385479568097114\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3067226890756303,\n \"acc_stderr\": 0.029953823891887037,\n \"acc_norm\": 0.3067226890756303,\n \"acc_norm_stderr\": 0.029953823891887037\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5247706422018349,\n \"acc_stderr\": 0.021410999753635914,\n \"acc_norm\": 0.5247706422018349,\n \"acc_norm_stderr\": 0.021410999753635914\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3101851851851852,\n \"acc_stderr\": 0.031546962856566295,\n \"acc_norm\": 0.3101851851851852,\n \"acc_norm_stderr\": 0.031546962856566295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.47549019607843135,\n \"acc_stderr\": 0.035050931943487976,\n \"acc_norm\": 0.47549019607843135,\n \"acc_norm_stderr\": 0.035050931943487976\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5443037974683544,\n \"acc_stderr\": 0.03241920684693334,\n \"acc_norm\": 0.5443037974683544,\n \"acc_norm_stderr\": 0.03241920684693334\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4663677130044843,\n \"acc_stderr\": 0.033481800170603065,\n \"acc_norm\": 0.4663677130044843,\n \"acc_norm_stderr\": 0.033481800170603065\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.4351145038167939,\n \"acc_stderr\": 0.04348208051644858,\n \"acc_norm\": 0.4351145038167939,\n \"acc_norm_stderr\": 0.04348208051644858\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5289256198347108,\n \"acc_stderr\": 0.04556710331269498,\n \"acc_norm\": 0.5289256198347108,\n \"acc_norm_stderr\": 0.04556710331269498\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04803752235190193,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04803752235190193\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.38650306748466257,\n \"acc_stderr\": 0.038258255488486076,\n \"acc_norm\": 0.38650306748466257,\n \"acc_norm_stderr\": 0.038258255488486076\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.04911147107365777,\n \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.04911147107365777\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.688034188034188,\n \"acc_stderr\": 0.03035152732334494,\n \"acc_norm\": 0.688034188034188,\n \"acc_norm_stderr\": 0.03035152732334494\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5261813537675607,\n \"acc_stderr\": 0.01785543455404199,\n 
\"acc_norm\": 0.5261813537675607,\n \"acc_norm_stderr\": 0.01785543455404199\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.41040462427745666,\n \"acc_stderr\": 0.026483392042098187,\n \"acc_norm\": 0.41040462427745666,\n \"acc_norm_stderr\": 0.026483392042098187\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2569832402234637,\n \"acc_stderr\": 0.01461446582196634,\n \"acc_norm\": 0.2569832402234637,\n \"acc_norm_stderr\": 0.01461446582196634\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.41830065359477125,\n \"acc_stderr\": 0.02824513402438729,\n \"acc_norm\": 0.41830065359477125,\n \"acc_norm_stderr\": 0.02824513402438729\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4662379421221865,\n \"acc_stderr\": 0.028333277109562786,\n \"acc_norm\": 0.4662379421221865,\n \"acc_norm_stderr\": 0.028333277109562786\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4506172839506173,\n \"acc_stderr\": 0.0276847214156562,\n \"acc_norm\": 0.4506172839506173,\n \"acc_norm_stderr\": 0.0276847214156562\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.32269503546099293,\n \"acc_stderr\": 0.027889139300534785,\n \"acc_norm\": 0.32269503546099293,\n \"acc_norm_stderr\": 0.027889139300534785\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32529335071707954,\n \"acc_stderr\": 0.011965311536571528,\n \"acc_norm\": 0.32529335071707954,\n \"acc_norm_stderr\": 0.011965311536571528\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.026303648393696036,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.026303648393696036\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.37745098039215685,\n \"acc_stderr\": 0.019610851474880286,\n \"acc_norm\": 0.37745098039215685,\n \"acc_norm_stderr\": 0.019610851474880286\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.45454545454545453,\n \"acc_stderr\": 0.04769300568972743,\n \"acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.04769300568972743\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3877551020408163,\n \"acc_stderr\": 0.031192230726795656,\n \"acc_norm\": 0.3877551020408163,\n \"acc_norm_stderr\": 0.031192230726795656\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.44776119402985076,\n \"acc_stderr\": 0.03516184772952167,\n \"acc_norm\": 0.44776119402985076,\n \"acc_norm_stderr\": 0.03516184772952167\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.038342347441649924,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.038342347441649924\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24112607099143207,\n \"mc1_stderr\": 0.014974827279752323,\n \"mc2\": 0.39026734427166393,\n \"mc2_stderr\": 0.01459951299615118\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6566692975532754,\n \"acc_stderr\": 0.013344823185357998\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15845337376800606,\n \"acc_stderr\": 0.010058474790238966\n }\n}\n```", "repo_url": "https://huggingface.co/tyson0420/stack_codellama-7b-inst", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|arc:challenge|25_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|gsm8k|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hellaswag|10_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T13-18-15.759578.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T13-18-15.759578.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T13-18-15.759578.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T13-18-15.759578.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T13-18-15.759578.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T13-18-15.759578.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["**/details_harness|winogrande|5_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T13-18-15.759578.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T13_18_15.759578", "path": ["results_2024-02-14T13-18-15.759578.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T13-18-15.759578.parquet"]}]}]} | 2024-02-14T13:21:01+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of tyson0420/stack_codellama-7b-inst
Dataset automatically created during the evaluation run of model tyson0420/stack_codellama-7b-inst on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
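The snippet below is a sketch: the details repository name is inferred from the leaderboard's usual naming pattern (the model id with `/` replaced by `__`), so adjust it if the actual repo name differs.

```python
from datasets import load_dataset

# Load the per-example details for one task of this evaluation run
data = load_dataset("open-llm-leaderboard/details_tyson0420__stack_codellama-7b-inst",
    "harness_winogrande_5",
    split="train")
```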
## Latest results
These are the latest results from run 2024-02-14T13:18:15.759578 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
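To fetch only the aggregated metrics, you can load the "results" configuration's "latest" split (both names are taken from this card's metadata; the repository name follows the same inferred pattern as the snippet above):

```python
from datasets import load_dataset

# The "results" config aggregates all metrics; "latest" always points to the newest run
results = load_dataset("open-llm-leaderboard/details_tyson0420__stack_codellama-7b-inst",
    "results",
    split="latest")
```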
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of tyson0420/stack_codellama-7b-inst\n\n\n\nDataset automatically created during the evaluation run of model tyson0420/stack_codellama-7b-inst on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T13:18:15.759578(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of tyson0420/stack_codellama-7b-inst\n\n\n\nDataset automatically created during the evaluation run of model tyson0420/stack_codellama-7b-inst on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T13:18:15.759578(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of tyson0420/stack_codellama-7b-inst\n\n\n\nDataset automatically created during the evaluation run of model tyson0420/stack_codellama-7b-inst on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T13:18:15.759578(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
d31d243dd63eba1fbf47e3a14c07e929f50e7750 | # Dataset Card for "pretrain_wiki"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | cfli/pretrain_wiki | [
"region:us"
] | 2024-02-14T13:23:06+00:00 | {"dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "output_summarize", "dtype": "string"}, {"name": "output_predict", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 49155157522, "num_examples": 10396029}], "download_size": 28572891492, "dataset_size": 49155157522}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-14T15:14:44+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "pretrain_wiki"
More Information needed | [
"# Dataset Card for \"pretrain_wiki\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"pretrain_wiki\"\n\nMore Information needed"
] | [
6,
14
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"pretrain_wiki\"\n\nMore Information needed"
] |
ef2b50e5e5f3d5058baed227e2ad5de6ad223c64 | TO BE UPDATED.
# ViNoM Dataset
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
<!--
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]-->
- **License:** This dataset is derived from VAW and Visual Genome. Please see their licenses at https://github.com/adobe-research/vaw_dataset/blob/main/LICENSE.md and https://homes.cs.washington.edu/~ranjay/visualgenome/index.html
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
## Dataset Creation
## Citation [optional]
TBA
**BibTeX:**
TBA
| voroujak/ViNoM | [
"license:cc-by-nc-4.0",
"region:us"
] | 2024-02-14T13:28:55+00:00 | {"license": "cc-by-nc-4.0"} | 2024-02-14T18:06:56+00:00 | [] | [] | TAGS
#license-cc-by-nc-4.0 #region-us
| TO BE UPDATED.
# ViNoM Dataset
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- License: This dataset is derived from VAW and VisualGnome. Please see their licenses at (URL and (URL
- Repository:
- Paper [optional]:
- Demo [optional]:
## Dataset Structure
## Dataset Creation
[optional]
TBA
BibTeX:
TBA
| [
"# ViNoM Dataset \n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- License: This dataset is derived from VAW and VisualGnome. Please see their licenses at (URL and (URL\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Dataset Structure",
"## Dataset Creation\n\n\n[optional]\n\nTBA\n\nBibTeX:\n\nTBA"
] | [
"TAGS\n#license-cc-by-nc-4.0 #region-us \n",
"# ViNoM Dataset \n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- License: This dataset is derived from VAW and VisualGnome. Please see their licenses at (URL and (URL\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Dataset Structure",
"## Dataset Creation\n\n\n[optional]\n\nTBA\n\nBibTeX:\n\nTBA"
] | [
17,
32,
4,
52,
6,
18
] | [
"passage: TAGS\n#license-cc-by-nc-4.0 #region-us \n# ViNoM Dataset \n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.## Dataset Details### Dataset Description\n\n\n\n\n\n- License: This dataset is derived from VAW and VisualGnome. Please see their licenses at (URL and (URL\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Dataset Structure## Dataset Creation\n\n\n[optional]\n\nTBA\n\nBibTeX:\n\nTBA"
] |
2f870f1b55e24515d816c76371e8183372c6d8c1 | Source: https://huggingface.co/datasets/songlab/multiz100way
`89.zarr.tar.gz` was created with `pigz` using option `-1` for fastest decompression.
Install `pigz` if not already installed:
```shell
sudo apt install pigz
```
Download and extract:
```shell
wget https://huggingface.co/datasets/lpigou/89.zarr/resolve/main/89.zarr.tar.gz
unpigz < 89.zarr.tar.gz | tar -x
``` | lpigou/89.zarr | [
"license:mit",
"region:us"
] | 2024-02-14T13:39:35+00:00 | {"license": "mit"} | 2024-02-15T08:03:09+00:00 | [] | [] | TAGS
#license-mit #region-us
| Source: URL
is created with with option for fastest decompression.
Install if not already installed:
Download and extract:
| [] | [
"TAGS\n#license-mit #region-us \n"
] | [
11
] | [
"passage: TAGS\n#license-mit #region-us \n"
] |
08a8d890fefe2360413c64aa8c8cd6822ef0e4ea | # Draft conversion of EdAcc
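A minimal loading sketch (the repo id `sanchit-gandhi/edacc`, the `validation` split, and the `text`/`accent` columns are taken from this card's metadata; since this is a draft conversion, names may change):

```python
from datasets import load_dataset

# Load the draft validation split and inspect one transcript with its accent label
edacc = load_dataset("sanchit-gandhi/edacc", split="validation")
print(edacc[0]["text"], edacc[0]["accent"])
```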
Final dataset will be moved to the edinburghcstr organisation. | sanchit-gandhi/edacc | [
"region:us"
] | 2024-02-14T13:43:32+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "speaker", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "accent", "dtype": "string"}, {"name": "raw_accent", "dtype": "string"}, {"name": "gender", "dtype": "string"}, {"name": "language", "dtype": "string"}, {"name": "audio", "dtype": "audio"}], "splits": [{"name": "validation", "num_bytes": 2615426357.928, "num_examples": 9848}, {"name": "test", "num_bytes": 4926406074.438, "num_examples": 9289}], "download_size": 6951142950, "dataset_size": 7541832432.365999}} | 2024-02-15T17:27:45+00:00 | [] | [] | TAGS
#region-us
| # Draft conversion of EdAcc
Final dataset will be moved to the edinburghcstr organisation. | [
"# Draft conversion of EdAcc\n\nFinal dataset will be moved to the edinburghcstr organisation."
] | [
"TAGS\n#region-us \n",
"# Draft conversion of EdAcc\n\nFinal dataset will be moved to the edinburghcstr organisation."
] | [
6,
23
] | [
"passage: TAGS\n#region-us \n# Draft conversion of EdAcc\n\nFinal dataset will be moved to the edinburghcstr organisation."
] |
6f9674740dbfc65b2103927c28412aad3934a8a0 |
# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v4.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bardsai/jaskier-7b-dpo-v4.3](https://huggingface.co/bardsai/jaskier-7b-dpo-v4.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.3",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-14T13:43:16.252848](https://huggingface.co/datasets/open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.3/blob/main/results_2024-02-14T13-43-16.252848.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6494440624014682,
"acc_stderr": 0.032086553295201554,
"acc_norm": 0.6485244990130871,
"acc_norm_stderr": 0.032761229277755544,
"mc1": 0.6340269277845777,
"mc1_stderr": 0.016862941684088386,
"mc2": 0.7826940259074282,
"mc2_stderr": 0.013701443041279172
},
"harness|arc:challenge|25": {
"acc": 0.7158703071672355,
"acc_stderr": 0.013179442447653884,
"acc_norm": 0.7261092150170648,
"acc_norm_stderr": 0.013032004972989506
},
"harness|hellaswag|10": {
"acc": 0.715893248356901,
"acc_stderr": 0.004500662294697923,
"acc_norm": 0.8908583947420833,
"acc_norm_stderr": 0.003111795320787942
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0303883535518868,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0303883535518868
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926924,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926924
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.01657402721951763,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.01657402721951763
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460842,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4765319426336376,
"acc_stderr": 0.012756161942523367,
"acc_norm": 0.4765319426336376,
"acc_norm_stderr": 0.012756161942523367
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6340269277845777,
"mc1_stderr": 0.016862941684088386,
"mc2": 0.7826940259074282,
"mc2_stderr": 0.013701443041279172
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.01009920824606559
},
"harness|gsm8k|5": {
"acc": 0.690674753601213,
"acc_stderr": 0.012731710925078134
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.3 | [
"region:us"
] | 2024-02-14T13:45:34+00:00 | {"pretty_name": "Evaluation run of bardsai/jaskier-7b-dpo-v4.3", "dataset_summary": "Dataset automatically created during the evaluation run of model [bardsai/jaskier-7b-dpo-v4.3](https://huggingface.co/bardsai/jaskier-7b-dpo-v4.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T13:43:16.252848](https://huggingface.co/datasets/open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.3/blob/main/results_2024-02-14T13-43-16.252848.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6494440624014682,\n \"acc_stderr\": 0.032086553295201554,\n \"acc_norm\": 0.6485244990130871,\n \"acc_norm_stderr\": 0.032761229277755544,\n \"mc1\": 0.6340269277845777,\n \"mc1_stderr\": 0.016862941684088386,\n \"mc2\": 0.7826940259074282,\n \"mc2_stderr\": 0.013701443041279172\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7158703071672355,\n \"acc_stderr\": 0.013179442447653884,\n \"acc_norm\": 0.7261092150170648,\n \"acc_norm_stderr\": 0.013032004972989506\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.715893248356901,\n \"acc_stderr\": 0.004500662294697923,\n \"acc_norm\": 0.8908583947420833,\n \"acc_norm_stderr\": 0.003111795320787942\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0303883535518868,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0303883535518868\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926924,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926924\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n \"acc_stderr\": 0.01657402721951763,\n \"acc_norm\": 0.4335195530726257,\n \"acc_norm_stderr\": 0.01657402721951763\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460842,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460842\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4765319426336376,\n \"acc_stderr\": 0.012756161942523367,\n \"acc_norm\": 0.4765319426336376,\n \"acc_norm_stderr\": 0.012756161942523367\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6340269277845777,\n \"mc1_stderr\": 0.016862941684088386,\n \"mc2\": 0.7826940259074282,\n \"mc2_stderr\": 0.013701443041279172\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.01009920824606559\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.690674753601213,\n \"acc_stderr\": 0.012731710925078134\n 
}\n}\n```", "repo_url": "https://huggingface.co/bardsai/jaskier-7b-dpo-v4.3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|arc:challenge|25_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|gsm8k|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hellaswag|10_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T13-43-16.252848.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T13-43-16.252848.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T13-43-16.252848.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T13-43-16.252848.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T13-43-16.252848.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T13_43_16.252848", "path": ["**/details_harness|winogrande|5_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T13-43-16.252848.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_14T13_43_16.252848", "path": ["results_2024-02-14T13-43-16.252848.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T13-43-16.252848.parquet"]}]}]} | 2024-02-14T13:45:58+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v4.3
Dataset automatically created during the evaluation run of model bardsai/jaskier-7b-dpo-v4.3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
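As a minimal sketch (the repository id below is an assumption, following the `details_<org>__<model>` naming pattern the Open LLM Leaderboard uses for its per-model detail datasets):

```python
from datasets import load_dataset

# Assumed repository id for this model's evaluation details.
data = load_dataset("open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.3",
	"harness_winogrande_5",
	split="train")
```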
## Latest results
These are the latest results from run 2024-02-14T13:43:16.252848 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v4.3\n\n\n\nDataset automatically created during the evaluation run of model bardsai/jaskier-7b-dpo-v4.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T13:43:16.252848(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v4.3\n\n\n\nDataset automatically created during the evaluation run of model bardsai/jaskier-7b-dpo-v4.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T13:43:16.252848(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v4.3\n\n\n\nDataset automatically created during the evaluation run of model bardsai/jaskier-7b-dpo-v4.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T13:43:16.252848(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
b896bc57e3a1fdde2febcc9a5d25430af8d50fd1 | # Dataset Card for "ultrachat_200k_filtered_1707919115"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | vwxyzjn/ultrachat_200k_filtered_1707919115 | [
"region:us"
] | 2024-02-14T13:58:52+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_reference_response", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_reference_response_token", "sequence": "int64"}, {"name": "query_reference_response_token_len", "dtype": "int64"}, {"name": "query", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query_token_len", "dtype": "int64"}, {"name": "reference_response", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "reference_response_token", "sequence": "int64"}, {"name": "reference_response_token_len", "dtype": "int64"}], "splits": [{"name": "test_gen", "num_bytes": 30484069, "num_examples": 1000}, {"name": "test_sft", "num_bytes": 39592502, "num_examples": 1000}, {"name": "train_gen", "num_bytes": 29613744, "num_examples": 1000}, {"name": "train_sft", "num_bytes": 39521233, "num_examples": 1000}], "download_size": 50859072, "dataset_size": 139211548}} | 2024-02-14T13:58:57+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ultrachat_200k_filtered_1707919115"
More Information needed | [
"# Dataset Card for \"ultrachat_200k_filtered_1707919115\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ultrachat_200k_filtered_1707919115\"\n\nMore Information needed"
] | [
6,
24
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"ultrachat_200k_filtered_1707919115\"\n\nMore Information needed"
] |
1a8632b0db88f6ed2a4064adedb9a76386992494 | # Args
```python
{'base_model': 'mistralai/Mistral-7B-v0.1',
'check_length_correctness': True,
'debug': True,
'hf_entity': 'vwxyzjn',
'params': TaskQueryHParams(length=3000,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[32000],
pad_side='left',
max_sft_response_length=1500,
max_sft_query_response_length=4500,
max_rm_response_length=169,
max_rm_query_response_length=638),
'push_to_hub': True}
```
| vwxyzjn/ultrachat_200k_filtered_1707919193 | [
"region:us"
] | 2024-02-14T14:00:10+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_reference_response", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_reference_response_token", "sequence": "int64"}, {"name": "query_reference_response_token_len", "dtype": "int64"}, {"name": "query", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query_token_len", "dtype": "int64"}, {"name": "reference_response", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "reference_response_token", "sequence": "int64"}, {"name": "reference_response_token_len", "dtype": "int64"}], "splits": [{"name": "test_gen", "num_bytes": 30484069, "num_examples": 1000}, {"name": "test_sft", "num_bytes": 39592502, "num_examples": 1000}, {"name": "train_gen", "num_bytes": 29613744, "num_examples": 1000}, {"name": "train_sft", "num_bytes": 39521233, "num_examples": 1000}], "download_size": 50859072, "dataset_size": 139211548}} | 2024-02-14T14:00:16+00:00 | [] | [] | TAGS
#region-us
| # Args
| [
"# Args"
] | [
"TAGS\n#region-us \n",
"# Args"
] | [
6,
3
] | [
"passage: TAGS\n#region-us \n# Args"
] |
64740f300658905c2338d503b258b53c52751e07 |
# Synthetic TTS Dataset
## Overview
This dataset was created with the aim of exploring the concept of using synthetic datasets for training Text-to-Speech (TTS) models. It consists of 1,388 audio files, totaling 2 hours and 20 minutes, together with their corresponding textual transcripts. The dataset leverages the capabilities of advanced AI services, utilizing paid subscriptions to ChatGPT-4 for text generation and ElevenLabs.io for audio generation.
## Dataset Composition
- **Audio Files**: 1,388 files
- **Total Duration**: 2 hours and 20 minutes
- **Text Transcripts**: Corresponding texts for each audio file
## Purpose
The primary goal of this dataset is to provide a resource for testing and developing TTS models, particularly to evaluate the effectiveness of synthetic datasets in training such models.
## Usage
This dataset is distributed "as is" under the MIT License, making it freely available for educational, research, and commercial purposes, with proper attribution required.
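As a minimal loading sketch using the standard `datasets` API (the split and column names here are assumptions based on common TTS dataset conventions, not confirmed by the repository):

```python
from datasets import load_dataset

# Pull the dataset from the Hub; "train" is an assumed split name.
ds = load_dataset("skypro1111/elevenlabs_dataset", split="train")

# Inspect one example; "audio" and "text" column names are assumptions.
example = ds[0]
print(example)
```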
## Model Training
A pflow model has been trained using this dataset, showcasing its potential for TTS applications:
- **Model Checkpoints**: [Hugging Face - pyflowtts_uk_elevenlabs](https://huggingface.co/skypro1111/pyflowtts_uk_elevenlabs)
- **Codebase**: [GitHub - skypro1111/pflowtts_pytorch_uk](https://github.com/skypro1111/pflowtts_pytorch_uk)
## License
This dataset is made available under the MIT License. See the LICENSE file in this repository for more details.
## Citation
If you use this dataset in your research or project, please cite it as follows:
```
@misc{synthetic_tts_dataset,
author = {@skypro1111},
title = {Synthetic TTS Dataset for Training Models},
year = {2024},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/skypro1111/pflowtts_pytorch_uk}}
}
```
## Acknowledgments
Created by [@skypro1111](https://github.com/skypro1111), this dataset was made possible through the use of ChatGPT-4 by OpenAI and ElevenLabs.io's audio generation capabilities.
---
© 2024 @skypro1111. All Rights Reserved.
| skypro1111/elevenlabs_dataset | [
"task_categories:text-to-speech",
"language:uk",
"license:mit",
"region:us"
] | 2024-02-14T14:03:12+00:00 | {"language": ["uk"], "license": "mit", "task_categories": ["text-to-speech"]} | 2024-02-14T14:28:58+00:00 | [] | [
"uk"
] | TAGS
#task_categories-text-to-speech #language-Ukrainian #license-mit #region-us
|
# Synthetic TTS Dataset
## Overview
This dataset was created with the aim of exploring the concept of using synthetic datasets for training Text-to-Speech (TTS) models. It consists of 1,388 audio files with a total duration of 2 hours and 20 minutes and their corresponding textual transcripts. The dataset leverages the capabilities of advanced AI services, utilizing paid subscriptions to ChatGPT-4 for text generation and URL for audio generation.
## Dataset Composition
- Audio Files: 1,388 files
- Total Duration: 2 hours and 20 minutes
- Text Transcripts: Corresponding texts for each audio file
## Purpose
The primary goal of this dataset is to provide a resource for testing and developing TTS models, particularly to evaluate the effectiveness of synthetic datasets in training such models.
## Usage
This dataset is distributed "as is" under the MIT License, making it freely available for educational, research, and commercial purposes, with proper attribution required.
## Model Training
A pflow model has been trained using this dataset, showcasing its potential for TTS applications:
- Model Checkpoints: Hugging Face - pyflowtts_uk_elevenlabs
- Codebase: GitHub - skypro1111/pflowtts_pytorch_uk
## License
This dataset is made available under the MIT License. See the LICENSE file in this repository for more details.
If you use this dataset in your research or project, please cite it as follows:
## Acknowledgments
Created by @skypro1111, this dataset was made possible through the use of ChatGPT-4 by OpenAI and URL's audio generation capabilities.
---
© 2024 @skypro1111. All Rights Reserved.
| [
"# Synthetic TTS Dataset",
"## Overview\n\nThis dataset was created with the aim of exploring the concept of using synthetic datasets for training Text-to-Speech (TTS) models. It consists of 1,388 audio files with a total duration of 2 hours and 20 minutes and their corresponding textual transcripts. The dataset leverages the capabilities of advanced AI services, utilizing paid subscriptions to ChatGPT-4 for text generation and URL for audio generation.",
"## Dataset Composition\n\n- Audio Files: 1,388 files\n- Total Duration: 2 hours and 20 minutes\n- Text Transcripts: Corresponding texts for each audio file",
"## Purpose\n\nThe primary goal of this dataset is to provide a resource for testing and developing TTS models, particularly to evaluate the effectiveness of synthetic datasets in training such models.",
"## Usage\n\nThis dataset is distributed \"as is\" under the MIT License, making it freely available for educational, research, and commercial purposes, with proper attribution required.",
"## Model Training\n\nA pflow model has been trained using this dataset, showcasing its potential for TTS applications:\n\n- Model Checkpoints: Hugging Face - pyflowtts_uk_elevenlabs\n- Codebase: GitHub - skypro1111/pflowtts_pytorch_uk",
"## License\n\nThis dataset is made available under the MIT License. See the LICENSE file in this repository for more details.\n\nIf you use this dataset in your research or project, please cite it as follows:",
"## Acknowledgments\n\nCreated by @skypro1111, this dataset was made possible through the use of ChatGPT-4 by OpenAI and URL's audio generation capabilities.\n\n---\n\n© 2024 @skypro1111. All Rights Reserved."
] | [
"TAGS\n#task_categories-text-to-speech #language-Ukrainian #license-mit #region-us \n",
"# Synthetic TTS Dataset",
"## Overview\n\nThis dataset was created with the aim of exploring the concept of using synthetic datasets for training Text-to-Speech (TTS) models. It consists of 1,388 audio files with a total duration of 2 hours and 20 minutes and their corresponding textual transcripts. The dataset leverages the capabilities of advanced AI services, utilizing paid subscriptions to ChatGPT-4 for text generation and URL for audio generation.",
"## Dataset Composition\n\n- Audio Files: 1,388 files\n- Total Duration: 2 hours and 20 minutes\n- Text Transcripts: Corresponding texts for each audio file",
"## Purpose\n\nThe primary goal of this dataset is to provide a resource for testing and developing TTS models, particularly to evaluate the effectiveness of synthetic datasets in training such models.",
"## Usage\n\nThis dataset is distributed \"as is\" under the MIT License, making it freely available for educational, research, and commercial purposes, with proper attribution required.",
"## Model Training\n\nA pflow model has been trained using this dataset, showcasing its potential for TTS applications:\n\n- Model Checkpoints: Hugging Face - pyflowtts_uk_elevenlabs\n- Codebase: GitHub - skypro1111/pflowtts_pytorch_uk",
"## License\n\nThis dataset is made available under the MIT License. See the LICENSE file in this repository for more details.\n\nIf you use this dataset in your research or project, please cite it as follows:",
"## Acknowledgments\n\nCreated by @skypro1111, this dataset was made possible through the use of ChatGPT-4 by OpenAI and URL's audio generation capabilities.\n\n---\n\n© 2024 @skypro1111. All Rights Reserved."
] | [
31,
8,
102,
37,
42,
39,
69,
47,
53
] | [
"passage: TAGS\n#task_categories-text-to-speech #language-Ukrainian #license-mit #region-us \n# Synthetic TTS Dataset## Overview\n\nThis dataset was created with the aim of exploring the concept of using synthetic datasets for training Text-to-Speech (TTS) models. It consists of 1,388 audio files with a total duration of 2 hours and 20 minutes and their corresponding textual transcripts. The dataset leverages the capabilities of advanced AI services, utilizing paid subscriptions to ChatGPT-4 for text generation and URL for audio generation.## Dataset Composition\n\n- Audio Files: 1,388 files\n- Total Duration: 2 hours and 20 minutes\n- Text Transcripts: Corresponding texts for each audio file## Purpose\n\nThe primary goal of this dataset is to provide a resource for testing and developing TTS models, particularly to evaluate the effectiveness of synthetic datasets in training such models.## Usage\n\nThis dataset is distributed \"as is\" under the MIT License, making it freely available for educational, research, and commercial purposes, with proper attribution required.## Model Training\n\nA pflow model has been trained using this dataset, showcasing its potential for TTS applications:\n\n- Model Checkpoints: Hugging Face - pyflowtts_uk_elevenlabs\n- Codebase: GitHub - skypro1111/pflowtts_pytorch_uk## License\n\nThis dataset is made available under the MIT License. See the LICENSE file in this repository for more details.\n\nIf you use this dataset in your research or project, please cite it as follows:## Acknowledgments\n\nCreated by @skypro1111, this dataset was made possible through the use of ChatGPT-4 by OpenAI and URL's audio generation capabilities.\n\n---\n\n© 2024 @skypro1111. All Rights Reserved."
] |
4e7c4b0a8c13c4f1a62846d2c6f3b2f1b636522e | # Args
```python
{'base_model': 'mistralai/Mistral-7B-v0.1',
'check_length_correctness': True,
'debug': True,
'hf_entity': 'vwxyzjn',
'params': TaskQueryHParams(length=3000,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[32000],
pad_side='left',
max_sft_response_length=1500,
max_sft_query_response_length=4500,
max_rm_response_length=169,
max_rm_query_response_length=638),
'push_to_hub': True}
```
| vwxyzjn/ultrachat_200k_filtered_1707919460 | [
"region:us"
] | 2024-02-14T14:04:37+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_reference_response", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_reference_response_token", "sequence": "int64"}, {"name": "query_reference_response_token_len", "dtype": "int64"}, {"name": "query", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query_token_len", "dtype": "int64"}, {"name": "reference_response", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "reference_response_token", "sequence": "int64"}, {"name": "reference_response_token_len", "dtype": "int64"}], "splits": [{"name": "test_gen", "num_bytes": 30484069, "num_examples": 1000}, {"name": "test_sft", "num_bytes": 39592502, "num_examples": 1000}, {"name": "train_gen", "num_bytes": 29613744, "num_examples": 1000}, {"name": "train_sft", "num_bytes": 39521233, "num_examples": 1000}], "download_size": 50859072, "dataset_size": 139211548}} | 2024-02-14T14:04:44+00:00 | [] | [] | TAGS
#region-us
| # Args
| [
"# Args"
] | [
"TAGS\n#region-us \n",
"# Args"
] | [
6,
3
] | [
"passage: TAGS\n#region-us \n# Args"
] |
75c4b98bd26bc0d2e6032547715fe8891f9d2416 | # Args
```python
{'base_model': 'mistralai/Mistral-7B-v0.1',
'check_length_correctness': True,
'debug': True,
'hf_entity': 'vwxyzjn',
'params': TaskQueryHParams(length=3000,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[32000],
pad_side='left',
max_sft_response_length=1500,
max_sft_query_response_length=4500,
max_rm_response_length=169,
max_rm_query_response_length=638),
'push_to_hub': True}
```
| vwxyzjn/ultrachat_200k_filtered_1707919621 | [
"region:us"
] | 2024-02-14T14:07:18+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_reference_response", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_reference_response_token", "sequence": "int64"}, {"name": "query_reference_response_token_len", "dtype": "int64"}, {"name": "query", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query_token_len", "dtype": "int64"}, {"name": "reference_response", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "reference_response_token", "sequence": "int64"}, {"name": "reference_response_token_len", "dtype": "int64"}], "splits": [{"name": "test_gen", "num_bytes": 30484069, "num_examples": 1000}, {"name": "test_sft", "num_bytes": 39592502, "num_examples": 1000}, {"name": "train_gen", "num_bytes": 29613744, "num_examples": 1000}, {"name": "train_sft", "num_bytes": 39521233, "num_examples": 1000}], "download_size": 50859072, "dataset_size": 139211548}} | 2024-02-14T14:07:22+00:00 | [] | [] | TAGS
#region-us
| # Args
| [
"# Args"
] | [
"TAGS\n#region-us \n",
"# Args"
] | [
6,
3
] | [
"passage: TAGS\n#region-us \n# Args"
] |
8b33ecd500a37cc373d23cce6b5396289977ddda | # Dataset Card for "ultrafeedback_binarized_1707919621"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | vwxyzjn/ultrafeedback_binarized_1707919621 | [
"region:us"
] | 2024-02-14T14:07:28+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "score_chosen", "dtype": "float64"}, {"name": "score_rejected", "dtype": "float64"}, {"name": "query", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query_token_len", "dtype": "int64"}, {"name": "chosen_token", "sequence": "int64"}, {"name": "chosen_token_len", "dtype": "int64"}, {"name": "rejected_token", "sequence": "int64"}, {"name": "rejected_token_len", "dtype": "int64"}], "splits": [{"name": "test_prefs", "num_bytes": 16973857, "num_examples": 1000}, {"name": "train_prefs", "num_bytes": 16589732, "num_examples": 1000}], "download_size": 0, "dataset_size": 33563589}} | 2024-02-14T14:08:24+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ultrafeedback_binarized_1707919621"
More Information needed | [
"# Dataset Card for \"ultrafeedback_binarized_1707919621\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ultrafeedback_binarized_1707919621\"\n\nMore Information needed"
] | [
6,
23
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"ultrafeedback_binarized_1707919621\"\n\nMore Information needed"
] |
7acd9667ced74ff1b7b4712f2835a709e30d369f | # Args
```python
{'base_model': 'mistralai/Mistral-7B-v0.1',
'check_length_correctness': True,
'debug': True,
'hf_entity': 'vwxyzjn',
'params': TaskQueryHParams(length=3000,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[32000],
pad_side='left',
max_sft_response_length=1500,
max_sft_query_response_length=4500,
max_rm_response_length=169,
max_rm_query_response_length=638),
'push_to_hub': True}
```
| vwxyzjn/ultrachat_200k_filtered_1707920039 | [
"region:us"
] | 2024-02-14T14:14:16+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_reference_response", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_reference_response_token", "sequence": "int64"}, {"name": "query_reference_response_token_len", "dtype": "int64"}, {"name": "query", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query_token_len", "dtype": "int64"}, {"name": "reference_response", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "reference_response_token", "sequence": "int64"}, {"name": "reference_response_token_len", "dtype": "int64"}], "splits": [{"name": "test_gen", "num_bytes": 30484069, "num_examples": 1000}, {"name": "test_sft", "num_bytes": 39592502, "num_examples": 1000}, {"name": "train_gen", "num_bytes": 29613744, "num_examples": 1000}, {"name": "train_sft", "num_bytes": 39521233, "num_examples": 1000}], "download_size": 50859072, "dataset_size": 139211548}} | 2024-02-14T14:14:29+00:00 | [] | [] | TAGS
#region-us
| # Args
| [
"# Args"
] | [
"TAGS\n#region-us \n",
"# Args"
] | [
6,
3
] | [
"passage: TAGS\n#region-us \n# Args"
] |
df488eac055ee2eb7be24547f2dc284a262ae080 | # Dataset Card for "ultrafeedback_binarized_1707920039"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | vwxyzjn/ultrafeedback_binarized_1707920039 | [
"region:us"
] | 2024-02-14T14:14:27+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "score_chosen", "dtype": "float64"}, {"name": "score_rejected", "dtype": "float64"}, {"name": "query", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query_token_len", "dtype": "int64"}, {"name": "chosen_token", "sequence": "int64"}, {"name": "chosen_token_len", "dtype": "int64"}, {"name": "rejected_token", "sequence": "int64"}, {"name": "rejected_token_len", "dtype": "int64"}], "splits": [{"name": "test_prefs", "num_bytes": 16973857, "num_examples": 1000}, {"name": "train_prefs", "num_bytes": 16589732, "num_examples": 1000}], "download_size": 12976788, "dataset_size": 33563589}} | 2024-02-14T14:14:29+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ultrafeedback_binarized_1707920039"
More Information needed | [
"# Dataset Card for \"ultrafeedback_binarized_1707920039\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ultrafeedback_binarized_1707920039\"\n\nMore Information needed"
] | [
6,
23
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"ultrafeedback_binarized_1707920039\"\n\nMore Information needed"
] |
dd7643ff685ea7392a6114b620bd0a71a6403ce1 | # Args
```python
{'base_model': 'mistralai/Mistral-7B-v0.1',
'check_length_correctness': True,
'debug': True,
'hf_entity': 'vwxyzjn',
'params': TaskQueryHParams(length=3000,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[32000],
pad_side='left',
max_sft_response_length=1500,
max_sft_query_response_length=4500,
max_rm_response_length=169,
max_rm_query_response_length=638),
'push_to_hub': True}
```
| vwxyzjn/ultrachat_200k_filtered_1707920811 | [
"region:us"
] | 2024-02-14T14:27:10+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_reference_response", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_reference_response_token", "sequence": "int64"}, {"name": "query_reference_response_token_len", "dtype": "int64"}, {"name": "query", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query_token_len", "dtype": "int64"}, {"name": "reference_response", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "reference_response_token", "sequence": "int64"}, {"name": "reference_response_token_len", "dtype": "int64"}], "splits": [{"name": "test_gen", "num_bytes": 30484069, "num_examples": 1000}, {"name": "test_sft", "num_bytes": 39592502, "num_examples": 1000}, {"name": "train_gen", "num_bytes": 29613744, "num_examples": 1000}, {"name": "train_sft", "num_bytes": 39521233, "num_examples": 1000}], "download_size": 50859072, "dataset_size": 139211548}} | 2024-02-14T14:27:15+00:00 | [] | [] | TAGS
#region-us
| # Args
| [
"# Args"
] | [
"TAGS\n#region-us \n",
"# Args"
] | [
6,
3
] | [
"passage: TAGS\n#region-us \n# Args"
] |
164c9844db2b0ad7d901b9b0322865687426a2b1 | # Args
```python
{'base_model': 'mistralai/Mistral-7B-v0.1',
'check_length_correctness': True,
'debug': True,
'hf_entity': 'vwxyzjn',
'params': TaskQueryHParams(length=3000,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[32000],
pad_side='left',
max_sft_response_length=1500,
max_sft_query_response_length=4500,
max_rm_response_length=169,
max_rm_query_response_length=638),
'push_to_hub': True}
```
| vwxyzjn/ultrachat_200k_filtered_1707921252 | [
"region:us"
] | 2024-02-14T14:34:29+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_reference_response", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_reference_response_token", "sequence": "int64"}, {"name": "query_reference_response_token_len", "dtype": "int64"}, {"name": "query", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query_token_len", "dtype": "int64"}, {"name": "reference_response", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "reference_response_token", "sequence": "int64"}, {"name": "reference_response_token_len", "dtype": "int64"}], "splits": [{"name": "test_gen", "num_bytes": 30484069, "num_examples": 1000}, {"name": "test_sft", "num_bytes": 39592502, "num_examples": 1000}, {"name": "train_gen", "num_bytes": 29613744, "num_examples": 1000}, {"name": "train_sft", "num_bytes": 39521233, "num_examples": 1000}], "download_size": 50859072, "dataset_size": 139211548}} | 2024-02-14T14:34:34+00:00 | [] | [] | TAGS
#region-us
| # Args
| [
"# Args"
] | [
"TAGS\n#region-us \n",
"# Args"
] | [
6,
3
] | [
"passage: TAGS\n#region-us \n# Args"
] |
3647c4dfb35e36607b8363f028f6d2e01625a518 | # Dataset Card for "ultrafeedback_binarized_1707921333"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | vwxyzjn/ultrafeedback_binarized_1707921333 | [
"region:us"
] | 2024-02-14T14:36:02+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "score_chosen", "dtype": "float64"}, {"name": "score_rejected", "dtype": "float64"}, {"name": "query", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query_token_len", "dtype": "int64"}, {"name": "chosen_token", "sequence": "int64"}, {"name": "chosen_token_len", "dtype": "int64"}, {"name": "rejected_token", "sequence": "int64"}, {"name": "rejected_token_len", "dtype": "int64"}], "splits": [{"name": "test_prefs", "num_bytes": 16973857, "num_examples": 1000}, {"name": "train_prefs", "num_bytes": 16589732, "num_examples": 1000}], "download_size": 12976788, "dataset_size": 33563589}} | 2024-02-14T14:36:06+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ultrafeedback_binarized_1707921333"
More Information needed | [
"# Dataset Card for \"ultrafeedback_binarized_1707921333\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ultrafeedback_binarized_1707921333\"\n\nMore Information needed"
] | [
6,
23
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"ultrafeedback_binarized_1707921333\"\n\nMore Information needed"
] |
5c050f530069549402e039e56bf855e2e8eaf937 |
# Dataset Card for Evaluation run of Kquant03/NeuralTrix-7B-dpo-laser
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/NeuralTrix-7B-dpo-laser](https://huggingface.co/Kquant03/NeuralTrix-7B-dpo-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-laser",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-14T14:37:51.781058](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-laser/blob/main/results_2024-02-14T14-37-51.781058.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.645893629951843,
"acc_stderr": 0.03226186920457953,
"acc_norm": 0.6452717180915718,
"acc_norm_stderr": 0.03293869736005958,
"mc1": 0.6291309669522643,
"mc1_stderr": 0.01690969358024884,
"mc2": 0.7814534342142354,
"mc2_stderr": 0.013720660446321574
},
"harness|arc:challenge|25": {
"acc": 0.6911262798634812,
"acc_stderr": 0.013501770929344003,
"acc_norm": 0.7133105802047781,
"acc_norm_stderr": 0.013214986329274772
},
"harness|hellaswag|10": {
"acc": 0.7005576578370842,
"acc_stderr": 0.004570777326263903,
"acc_norm": 0.8850826528579964,
"acc_norm_stderr": 0.00318270383035113
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406762,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406762
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031086,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031086
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276877,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276877
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606649,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606649
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092427,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092427
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931048,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931048
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4424581005586592,
"acc_stderr": 0.01661139368726858,
"acc_norm": 0.4424581005586592,
"acc_norm_stderr": 0.01661139368726858
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079072,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079072
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080632,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080632
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6291309669522643,
"mc1_stderr": 0.01690969358024884,
"mc2": 0.7814534342142354,
"mc2_stderr": 0.013720660446321574
},
"harness|winogrande|5": {
"acc": 0.8445146014206788,
"acc_stderr": 0.010184308214775777
},
"harness|gsm8k|5": {
"acc": 0.690674753601213,
"acc_stderr": 0.012731710925078134
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-laser | [
"region:us"
] | 2024-02-14T14:40:14+00:00 | {"pretty_name": "Evaluation run of Kquant03/NeuralTrix-7B-dpo-laser", "dataset_summary": "Dataset automatically created during the evaluation run of model [Kquant03/NeuralTrix-7B-dpo-laser](https://huggingface.co/Kquant03/NeuralTrix-7B-dpo-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-laser\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T14:37:51.781058](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-laser/blob/main/results_2024-02-14T14-37-51.781058.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.645893629951843,\n \"acc_stderr\": 0.03226186920457953,\n \"acc_norm\": 0.6452717180915718,\n \"acc_norm_stderr\": 0.03293869736005958,\n \"mc1\": 0.6291309669522643,\n \"mc1_stderr\": 0.01690969358024884,\n \"mc2\": 0.7814534342142354,\n \"mc2_stderr\": 0.013720660446321574\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6911262798634812,\n \"acc_stderr\": 0.013501770929344003,\n \"acc_norm\": 0.7133105802047781,\n \"acc_norm_stderr\": 0.013214986329274772\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7005576578370842,\n \"acc_stderr\": 0.004570777326263903,\n \"acc_norm\": 0.8850826528579964,\n \"acc_norm_stderr\": 0.00318270383035113\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n 
\"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406762,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406762\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n \"acc_stderr\": 0.024362599693031086,\n \"acc_norm\": 0.7580645161290323,\n \"acc_norm_stderr\": 0.024362599693031086\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276877,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276877\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606649,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606649\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092427,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092427\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931048,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931048\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4424581005586592,\n \"acc_stderr\": 0.01661139368726858,\n \"acc_norm\": 0.4424581005586592,\n \"acc_norm_stderr\": 0.01661139368726858\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n \"acc_stderr\": 0.012747248967079072,\n \"acc_norm\": 0.470013037809648,\n \"acc_norm_stderr\": 0.012747248967079072\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080632,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080632\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6291309669522643,\n \"mc1_stderr\": 0.01690969358024884,\n \"mc2\": 0.7814534342142354,\n \"mc2_stderr\": 0.013720660446321574\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8445146014206788,\n \"acc_stderr\": 0.010184308214775777\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.690674753601213,\n \"acc_stderr\": 0.012731710925078134\n 
}\n}\n```", "repo_url": "https://huggingface.co/Kquant03/NeuralTrix-7B-dpo-laser", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|arc:challenge|25_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|gsm8k|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hellaswag|10_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T14-37-51.781058.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T14-37-51.781058.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T14-37-51.781058.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T14-37-51.781058.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T14-37-51.781058.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T14_37_51.781058", "path": ["**/details_harness|winogrande|5_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T14-37-51.781058.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_14T14_37_51.781058", "path": ["results_2024-02-14T14-37-51.781058.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T14-37-51.781058.parquet"]}]}]} | 2024-02-14T14:40:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Kquant03/NeuralTrix-7B-dpo-laser
Dataset automatically created during the evaluation run of model Kquant03/NeuralTrix-7B-dpo-laser on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
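```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__NeuralTrix-7B-dpo-laser",
	"harness_winogrande_5",
	split="train")
```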
## Latest results
These are the latest results from run 2024-02-14T14:37:51.781058 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
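```python
{
    "all": {
        "acc": 0.645893629951843,
        "acc_stderr": 0.03226186920457953,
        "acc_norm": 0.6452717180915718,
        "acc_norm_stderr": 0.03293869736005958,
        "mc1": 0.6291309669522643,
        "mc1_stderr": 0.01690969358024884,
        "mc2": 0.7814534342142354,
        "mc2_stderr": 0.013720660446321574
    }
}
```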
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Kquant03/NeuralTrix-7B-dpo-laser\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/NeuralTrix-7B-dpo-laser on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T14:37:51.781058(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Kquant03/NeuralTrix-7B-dpo-laser\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/NeuralTrix-7B-dpo-laser on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T14:37:51.781058(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Kquant03/NeuralTrix-7B-dpo-laser\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/NeuralTrix-7B-dpo-laser on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T14:37:51.781058(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
eeead9adfb57080fcd58808d0af11779ec2bf770 | Dataset for writing-style-transfer experimentation, based on the article:
https://ai-r.com/blog/pirate-linguistics-and-tone-of-voice-fine-tuning-llms-to-talk-like-swashbucklers
Only the responses are in 'pirate speech' | TeeZee/dolly-15k-pirate-speech | [
"task_categories:question-answering",
"task_categories:summarization",
"task_categories:text-generation",
"language:en",
"license:cc-by-sa-3.0",
"region:us"
] | 2024-02-14T14:53:38+00:00 | {"language": ["en"], "license": "cc-by-sa-3.0", "task_categories": ["question-answering", "summarization", "text-generation"], "pretty_name": "pirate speech"} | 2024-02-15T22:10:24+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #task_categories-summarization #task_categories-text-generation #language-English #license-cc-by-sa-3.0 #region-us
| Dataset for writing-style-transfer experimentation, based on the article:
URL
Only the responses are in 'pirate speech' (see the loading sketch below) | [] | [
"TAGS\n#task_categories-question-answering #task_categories-summarization #task_categories-text-generation #language-English #license-cc-by-sa-3.0 #region-us \n"
] | [
54
] | [
"passage: TAGS\n#task_categories-question-answering #task_categories-summarization #task_categories-text-generation #language-English #license-cc-by-sa-3.0 #region-us \n"
] |
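A minimal sketch for loading the pirate-speech pairs with the `datasets` library; the split name and the column layout (assumed to mirror the original dolly-15k schema: `instruction`, `context`, `response`, `category`) are assumptions, not confirmed by the card:

```python
from datasets import load_dataset

# Load the style-transfer pairs; only the responses are rewritten in pirate speech.
ds = load_dataset("TeeZee/dolly-15k-pirate-speech", split="train")  # split name assumed

# Inspect one pair (column names assumed from the original dolly-15k schema).
example = ds[0]
print(example["instruction"])  # instruction stays in standard English
print(example["response"])     # response expected to be in pirate speech
```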
83a5e0aac53d9122dfa45374dc3818955ed59191 |
# Dataset Card for Evaluation run of davzoku/frankencria-llama2-11b-v1.3-m.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [davzoku/frankencria-llama2-11b-v1.3-m.1](https://huggingface.co/davzoku/frankencria-llama2-11b-v1.3-m.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_davzoku__frankencria-llama2-11b-v1.3-m.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-14T15:22:38.067991](https://huggingface.co/datasets/open-llm-leaderboard/details_davzoku__frankencria-llama2-11b-v1.3-m.1/blob/main/results_2024-02-14T15-22-38.067991.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4805808032934614,
"acc_stderr": 0.03425901458913262,
"acc_norm": 0.4858393351991251,
"acc_norm_stderr": 0.03502593471342456,
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.46868611616690686,
"mc2_stderr": 0.015784113350451722
},
"harness|arc:challenge|25": {
"acc": 0.49402730375426623,
"acc_stderr": 0.014610348300255795,
"acc_norm": 0.5281569965870307,
"acc_norm_stderr": 0.014588204105102202
},
"harness|hellaswag|10": {
"acc": 0.5941047600079665,
"acc_stderr": 0.004900608529778612,
"acc_norm": 0.77504481179048,
"acc_norm_stderr": 0.004166994527570876
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5358490566037736,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.5358490566037736,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5069444444444444,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.5069444444444444,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.045595221419582166,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.045595221419582166
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30687830687830686,
"acc_stderr": 0.023752928712112133,
"acc_norm": 0.30687830687830686,
"acc_norm_stderr": 0.023752928712112133
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790604,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790604
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5258064516129032,
"acc_stderr": 0.02840609505765332,
"acc_norm": 0.5258064516129032,
"acc_norm_stderr": 0.02840609505765332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.03422398565657551,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.03422398565657551
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.0381549430868893,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.0381549430868893
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0347327959083696,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0347327959083696
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.032922966391551414,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.032922966391551414
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4128205128205128,
"acc_stderr": 0.024962683564331803,
"acc_norm": 0.4128205128205128,
"acc_norm_stderr": 0.024962683564331803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945273,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945273
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42436974789915966,
"acc_stderr": 0.03210479051015776,
"acc_norm": 0.42436974789915966,
"acc_norm_stderr": 0.03210479051015776
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6752293577981652,
"acc_stderr": 0.020077729109310327,
"acc_norm": 0.6752293577981652,
"acc_norm_stderr": 0.020077729109310327
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.03191923445686185,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.03191923445686185
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.03343311240488419,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.03343311240488419
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6497890295358649,
"acc_stderr": 0.031052391937584346,
"acc_norm": 0.6497890295358649,
"acc_norm_stderr": 0.031052391937584346
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5919282511210763,
"acc_stderr": 0.03298574607842822,
"acc_norm": 0.5919282511210763,
"acc_norm_stderr": 0.03298574607842822
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.04374928560599738,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.04374928560599738
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.558282208588957,
"acc_stderr": 0.03901591825836184,
"acc_norm": 0.558282208588957,
"acc_norm_stderr": 0.03901591825836184
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.029872577708891183,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.029872577708891183
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6768837803320562,
"acc_stderr": 0.016723726512343048,
"acc_norm": 0.6768837803320562,
"acc_norm_stderr": 0.016723726512343048
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.523121387283237,
"acc_stderr": 0.026890297881303118,
"acc_norm": 0.523121387283237,
"acc_norm_stderr": 0.026890297881303118
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2111731843575419,
"acc_stderr": 0.013650276794312202,
"acc_norm": 0.2111731843575419,
"acc_norm_stderr": 0.013650276794312202
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5032679738562091,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.5032679738562091,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5498392282958199,
"acc_stderr": 0.02825666072336018,
"acc_norm": 0.5498392282958199,
"acc_norm_stderr": 0.02825666072336018
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5432098765432098,
"acc_stderr": 0.027716661650194038,
"acc_norm": 0.5432098765432098,
"acc_norm_stderr": 0.027716661650194038
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.375886524822695,
"acc_stderr": 0.028893955412115886,
"acc_norm": 0.375886524822695,
"acc_norm_stderr": 0.028893955412115886
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3494132985658409,
"acc_stderr": 0.01217730625278669,
"acc_norm": 0.3494132985658409,
"acc_norm_stderr": 0.01217730625278669
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.03018753206032939,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.03018753206032939
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.47549019607843135,
"acc_stderr": 0.02020351728026144,
"acc_norm": 0.47549019607843135,
"acc_norm_stderr": 0.02020351728026144
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.49795918367346936,
"acc_stderr": 0.0320089533497105,
"acc_norm": 0.49795918367346936,
"acc_norm_stderr": 0.0320089533497105
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6567164179104478,
"acc_stderr": 0.03357379665433431,
"acc_norm": 0.6567164179104478,
"acc_norm_stderr": 0.03357379665433431
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.46868611616690686,
"mc2_stderr": 0.015784113350451722
},
"harness|winogrande|5": {
"acc": 0.7158642462509865,
"acc_stderr": 0.012675392786772733
},
"harness|gsm8k|5": {
"acc": 0.15011372251705837,
"acc_stderr": 0.009838590860906968
}
}
```
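The aggregated numbers above can also be retrieved programmatically through the `results` configuration (a minimal sketch; the repository and configuration names are taken from this card's metadata, where the `results` config exposes a `latest` split):

```python
from datasets import load_dataset

# Load the aggregated metrics of the latest evaluation run for this model.
results = load_dataset("open-llm-leaderboard/details_davzoku__frankencria-llama2-11b-v1.3-m.1",
	"results",
	split="latest")
```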
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_davzoku__frankencria-llama2-11b-v1.3-m.1 | [
"region:us"
] | 2024-02-14T15:24:56+00:00 | {"pretty_name": "Evaluation run of davzoku/frankencria-llama2-11b-v1.3-m.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [davzoku/frankencria-llama2-11b-v1.3-m.1](https://huggingface.co/davzoku/frankencria-llama2-11b-v1.3-m.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_davzoku__frankencria-llama2-11b-v1.3-m.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T15:22:38.067991](https://huggingface.co/datasets/open-llm-leaderboard/details_davzoku__frankencria-llama2-11b-v1.3-m.1/blob/main/results_2024-02-14T15-22-38.067991.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4805808032934614,\n \"acc_stderr\": 0.03425901458913262,\n \"acc_norm\": 0.4858393351991251,\n \"acc_norm_stderr\": 0.03502593471342456,\n \"mc1\": 0.3023255813953488,\n \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.46868611616690686,\n \"mc2_stderr\": 0.015784113350451722\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.49402730375426623,\n \"acc_stderr\": 0.014610348300255795,\n \"acc_norm\": 0.5281569965870307,\n \"acc_norm_stderr\": 0.014588204105102202\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5941047600079665,\n \"acc_stderr\": 0.004900608529778612,\n \"acc_norm\": 0.77504481179048,\n \"acc_norm_stderr\": 0.004166994527570876\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5358490566037736,\n \"acc_stderr\": 0.030693675018458003,\n \"acc_norm\": 0.5358490566037736,\n \"acc_norm_stderr\": 0.030693675018458003\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5069444444444444,\n \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.5069444444444444,\n \"acc_norm_stderr\": 0.04180806750294938\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.3988439306358382,\n \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.045595221419582166,\n \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.045595221419582166\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30687830687830686,\n \"acc_stderr\": 0.023752928712112133,\n \"acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.023752928712112133\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n \"acc_stderr\": 0.03764950879790604,\n \"acc_norm\": 0.23015873015873015,\n \"acc_norm_stderr\": 0.03764950879790604\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5258064516129032,\n \"acc_stderr\": 0.02840609505765332,\n \"acc_norm\": 0.5258064516129032,\n \"acc_norm_stderr\": 0.02840609505765332\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.03422398565657551,\n \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.03422398565657551\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.0381549430868893,\n \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.0381549430868893\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.0347327959083696,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.0347327959083696\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.032922966391551414,\n \"acc_norm\": 0.7046632124352331,\n 
\"acc_norm_stderr\": 0.032922966391551414\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4128205128205128,\n \"acc_stderr\": 0.024962683564331803,\n \"acc_norm\": 0.4128205128205128,\n \"acc_norm_stderr\": 0.024962683564331803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945273,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945273\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.03210479051015776,\n \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.03210479051015776\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6752293577981652,\n \"acc_stderr\": 0.020077729109310327,\n \"acc_norm\": 0.6752293577981652,\n \"acc_norm_stderr\": 0.020077729109310327\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686185,\n \"acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686185\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.03343311240488419,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.03343311240488419\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6497890295358649,\n \"acc_stderr\": 0.031052391937584346,\n \"acc_norm\": 0.6497890295358649,\n \"acc_norm_stderr\": 0.031052391937584346\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5919282511210763,\n \"acc_stderr\": 0.03298574607842822,\n \"acc_norm\": 0.5919282511210763,\n \"acc_norm_stderr\": 0.03298574607842822\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.04374928560599738,\n \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.04374928560599738\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.558282208588957,\n \"acc_stderr\": 0.03901591825836184,\n \"acc_norm\": 0.558282208588957,\n \"acc_norm_stderr\": 0.03901591825836184\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7051282051282052,\n \"acc_stderr\": 0.029872577708891183,\n \"acc_norm\": 0.7051282051282052,\n \"acc_norm_stderr\": 0.029872577708891183\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6768837803320562,\n \"acc_stderr\": 0.016723726512343048,\n \"acc_norm\": 0.6768837803320562,\n \"acc_norm_stderr\": 0.016723726512343048\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.523121387283237,\n \"acc_stderr\": 0.026890297881303118,\n \"acc_norm\": 0.523121387283237,\n \"acc_norm_stderr\": 0.026890297881303118\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2111731843575419,\n \"acc_stderr\": 0.013650276794312202,\n \"acc_norm\": 0.2111731843575419,\n \"acc_norm_stderr\": 0.013650276794312202\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5032679738562091,\n \"acc_stderr\": 0.02862930519400354,\n \"acc_norm\": 0.5032679738562091,\n \"acc_norm_stderr\": 0.02862930519400354\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5498392282958199,\n \"acc_stderr\": 0.02825666072336018,\n \"acc_norm\": 0.5498392282958199,\n \"acc_norm_stderr\": 0.02825666072336018\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5432098765432098,\n \"acc_stderr\": 0.027716661650194038,\n \"acc_norm\": 0.5432098765432098,\n \"acc_norm_stderr\": 0.027716661650194038\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115886,\n \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115886\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3494132985658409,\n \"acc_stderr\": 0.01217730625278669,\n \"acc_norm\": 0.3494132985658409,\n \"acc_norm_stderr\": 0.01217730625278669\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.03018753206032939,\n \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.03018753206032939\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.47549019607843135,\n \"acc_stderr\": 0.02020351728026144,\n \"acc_norm\": 0.47549019607843135,\n \"acc_norm_stderr\": 0.02020351728026144\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.49795918367346936,\n \"acc_stderr\": 0.0320089533497105,\n \"acc_norm\": 0.49795918367346936,\n \"acc_norm_stderr\": 0.0320089533497105\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6567164179104478,\n \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.6567164179104478,\n \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.034678266857038266,\n \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.034678266857038266\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3023255813953488,\n \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.46868611616690686,\n \"mc2_stderr\": 0.015784113350451722\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7158642462509865,\n \"acc_stderr\": 0.012675392786772733\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.15011372251705837,\n \"acc_stderr\": 0.009838590860906968\n }\n}\n```", "repo_url": "https://huggingface.co/davzoku/frankencria-llama2-11b-v1.3-m.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|arc:challenge|25_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|gsm8k|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hellaswag|10_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-22-38.067991.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-22-38.067991.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-22-38.067991.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T15-22-38.067991.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-22-38.067991.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T15_22_38.067991", "path": ["**/details_harness|winogrande|5_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T15-22-38.067991.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_14T15_22_38.067991", "path": ["results_2024-02-14T15-22-38.067991.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T15-22-38.067991.parquet"]}]}]} | 2024-02-14T15:25:21+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of davzoku/frankencria-llama2-11b-v1.3-m.1
Dataset automatically created during the evaluation run of model davzoku/frankencria-llama2-11b-v1.3-m.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
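A minimal sketch of the standard loading pattern used by these cards (the repository id is assumed to follow the leaderboard's `details_<org>__<model>` naming):

```python
from datasets import load_dataset

# Each evaluated task is its own configuration; the "train" split always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_davzoku__frankencria-llama2-11b-v1.3-m.1",
                    "harness_winogrande_5",
                    split="train")
```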
## Latest results
These are the latest results from run 2024-02-14T15:22:38.067991 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of davzoku/frankencria-llama2-11b-v1.3-m.1\n\n\n\nDataset automatically created during the evaluation run of model davzoku/frankencria-llama2-11b-v1.3-m.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T15:22:38.067991(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of davzoku/frankencria-llama2-11b-v1.3-m.1\n\n\n\nDataset automatically created during the evaluation run of model davzoku/frankencria-llama2-11b-v1.3-m.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T15:22:38.067991(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
203,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of davzoku/frankencria-llama2-11b-v1.3-m.1\n\n\n\nDataset automatically created during the evaluation run of model davzoku/frankencria-llama2-11b-v1.3-m.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T15:22:38.067991(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
195a03a7e3bc6d6a349b55986afba5319929c3bf | # Dataset Card for "PGM"
Dataset for the paper [Measuring abstract reasoning in neural networks](https://arxiv.org/abs/1807.04225).
Only the `neutral` config is present at this point. | HuggingFaceM4/PGM | [
"arxiv:1807.04225",
"region:us"
] | 2024-02-14T15:41:07+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "panels", "list": "image"}, {"name": "choices", "list": "image"}, {"name": "relation_structure_encoded", "dtype": {"array2_d": {"shape": [4, 12], "dtype": "uint8"}}}, {"name": "relation_structure", "dtype": {"array2_d": {"shape": [1, 3], "dtype": "string"}}}, {"name": "meta_target", "dtype": {"array2_d": {"shape": [1, 12], "dtype": "uint8"}}}, {"name": "target", "dtype": "uint8"}, {"name": "id", "dtype": "int32"}], "splits": [{"name": "train", "num_bytes": 26850203831.0, "num_examples": 1200000}, {"name": "validation", "num_bytes": 602510542.0, "num_examples": 20000}, {"name": "test", "num_bytes": 4475789847.0, "num_examples": 200000}], "download_size": 44244925294, "dataset_size": 31928504220.0}} | 2024-02-15T07:07:47+00:00 | [
"1807.04225"
] | [] | TAGS
#arxiv-1807.04225 #region-us
| # Dataset Card for "PGM"
Dataset for the paper Measuring abstract reasoning in neural networks.
Only the 'neutral' config is present at this point. | [
"# Dataset Card for \"PGM\"\n\nDataset for the paper Measuring abstract reasoning in neural networks\n.\nOnly the 'neutral' config is present at this point."
] | [
"TAGS\n#arxiv-1807.04225 #region-us \n",
"# Dataset Card for \"PGM\"\n\nDataset for the paper Measuring abstract reasoning in neural networks\n.\nOnly the 'neutral' config is present at this point."
] | [
14,
40
] | [
"passage: TAGS\n#arxiv-1807.04225 #region-us \n# Dataset Card for \"PGM\"\n\nDataset for the paper Measuring abstract reasoning in neural networks\n.\nOnly the 'neutral' config is present at this point."
] |
554294c91645098c7e482c8ec4bffa36985a57c7 |
# Dataset Card for Evaluation run of davzoku/frankencria-llama2-12.5b-v1.3-m.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [davzoku/frankencria-llama2-12.5b-v1.3-m.2](https://huggingface.co/davzoku/frankencria-llama2-12.5b-v1.3-m.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_davzoku__frankencria-llama2-12.5b-v1.3-m.2",
"harness_winogrande_5",
split="train")
```
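Since the repository exposes 63 configurations, it can also help to enumerate them before loading; a minimal sketch using the standard `datasets` helper (the helper is part of the public `datasets` API, not something specific to this card):

```python
from datasets import get_dataset_config_names

# List every available configuration, e.g. "harness_winogrande_5", "harness_gsm8k_5", "results".
configs = get_dataset_config_names("open-llm-leaderboard/details_davzoku__frankencria-llama2-12.5b-v1.3-m.2")
print(len(configs), configs[:5])
```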
## Latest results
These are the [latest results from run 2024-02-14T15:39:16.700250](https://huggingface.co/datasets/open-llm-leaderboard/details_davzoku__frankencria-llama2-12.5b-v1.3-m.2/blob/main/results_2024-02-14T15-39-16.700250.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4617722987057413,
"acc_stderr": 0.03442751213597903,
"acc_norm": 0.4686971200766007,
"acc_norm_stderr": 0.03527698452163269,
"mc1": 0.3219094247246022,
"mc1_stderr": 0.016355567611960404,
"mc2": 0.5030678363563933,
"mc2_stderr": 0.01589753382807047
},
"harness|arc:challenge|25": {
"acc": 0.5110921501706485,
"acc_stderr": 0.01460779491401305,
"acc_norm": 0.5503412969283277,
"acc_norm_stderr": 0.014537144444284743
},
"harness|hellaswag|10": {
"acc": 0.6093407687711612,
"acc_stderr": 0.0048690101522807505,
"acc_norm": 0.7916749651463851,
"acc_norm_stderr": 0.004052804959005537
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3925925925925926,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.3925925925925926,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4867924528301887,
"acc_stderr": 0.030762134874500476,
"acc_norm": 0.4867924528301887,
"acc_norm_stderr": 0.030762134874500476
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5,
"acc_stderr": 0.04181210050035455,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04181210050035455
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.36416184971098264,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.36416184971098264,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.024180497164376896,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.024180497164376896
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.535483870967742,
"acc_stderr": 0.028372287797962935,
"acc_norm": 0.535483870967742,
"acc_norm_stderr": 0.028372287797962935
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868407,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868407
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5515151515151515,
"acc_stderr": 0.038835659779569286,
"acc_norm": 0.5515151515151515,
"acc_norm_stderr": 0.038835659779569286
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.035212249088415845,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.035212249088415845
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6476683937823834,
"acc_stderr": 0.03447478286414357,
"acc_norm": 0.6476683937823834,
"acc_norm_stderr": 0.03447478286414357
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.41794871794871796,
"acc_stderr": 0.02500732988246122,
"acc_norm": 0.41794871794871796,
"acc_norm_stderr": 0.02500732988246122
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712166,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712166
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6458715596330276,
"acc_stderr": 0.020504729013829118,
"acc_norm": 0.6458715596330276,
"acc_norm_stderr": 0.020504729013829118
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.03167468706828979,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.03167468706828979
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.03471157907953426,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.03471157907953426
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.031450686007448596,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.031450686007448596
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5515695067264574,
"acc_stderr": 0.03337883736255098,
"acc_norm": 0.5515695067264574,
"acc_norm_stderr": 0.03337883736255098
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5267175572519084,
"acc_stderr": 0.04379024936553894,
"acc_norm": 0.5267175572519084,
"acc_norm_stderr": 0.04379024936553894
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5030674846625767,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.5030674846625767,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6495726495726496,
"acc_stderr": 0.0312561082442188,
"acc_norm": 0.6495726495726496,
"acc_norm_stderr": 0.0312561082442188
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6513409961685823,
"acc_stderr": 0.017041243143490974,
"acc_norm": 0.6513409961685823,
"acc_norm_stderr": 0.017041243143490974
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5173410404624278,
"acc_stderr": 0.026902900458666647,
"acc_norm": 0.5173410404624278,
"acc_norm_stderr": 0.026902900458666647
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.21675977653631284,
"acc_stderr": 0.01378059848644335,
"acc_norm": 0.21675977653631284,
"acc_norm_stderr": 0.01378059848644335
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.477124183006536,
"acc_stderr": 0.028599936776089782,
"acc_norm": 0.477124183006536,
"acc_norm_stderr": 0.028599936776089782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5562700964630225,
"acc_stderr": 0.02821768355665232,
"acc_norm": 0.5562700964630225,
"acc_norm_stderr": 0.02821768355665232
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5339506172839507,
"acc_stderr": 0.027756535257347666,
"acc_norm": 0.5339506172839507,
"acc_norm_stderr": 0.027756535257347666
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.02860208586275942,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.02860208586275942
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35528031290743156,
"acc_stderr": 0.012223623364044037,
"acc_norm": 0.35528031290743156,
"acc_norm_stderr": 0.012223623364044037
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329387,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329387
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.45751633986928103,
"acc_stderr": 0.020154685712590888,
"acc_norm": 0.45751633986928103,
"acc_norm_stderr": 0.020154685712590888
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4448979591836735,
"acc_stderr": 0.031814251181977865,
"acc_norm": 0.4448979591836735,
"acc_norm_stderr": 0.031814251181977865
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5870646766169154,
"acc_stderr": 0.034815208033673474,
"acc_norm": 0.5870646766169154,
"acc_norm_stderr": 0.034815208033673474
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3219094247246022,
"mc1_stderr": 0.016355567611960404,
"mc2": 0.5030678363563933,
"mc2_stderr": 0.01589753382807047
},
"harness|winogrande|5": {
"acc": 0.7024467245461721,
"acc_stderr": 0.012849085254614662
},
"harness|gsm8k|5": {
"acc": 0.03411675511751327,
"acc_stderr": 0.00500021260077329
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_davzoku__frankencria-llama2-12.5b-v1.3-m.2 | [
"region:us"
] | 2024-02-14T15:41:34+00:00 | {"pretty_name": "Evaluation run of davzoku/frankencria-llama2-12.5b-v1.3-m.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [davzoku/frankencria-llama2-12.5b-v1.3-m.2](https://huggingface.co/davzoku/frankencria-llama2-12.5b-v1.3-m.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_davzoku__frankencria-llama2-12.5b-v1.3-m.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T15:39:16.700250](https://huggingface.co/datasets/open-llm-leaderboard/details_davzoku__frankencria-llama2-12.5b-v1.3-m.2/blob/main/results_2024-02-14T15-39-16.700250.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4617722987057413,\n \"acc_stderr\": 0.03442751213597903,\n \"acc_norm\": 0.4686971200766007,\n \"acc_norm_stderr\": 0.03527698452163269,\n \"mc1\": 0.3219094247246022,\n \"mc1_stderr\": 0.016355567611960404,\n \"mc2\": 0.5030678363563933,\n \"mc2_stderr\": 0.01589753382807047\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5110921501706485,\n \"acc_stderr\": 0.01460779491401305,\n \"acc_norm\": 0.5503412969283277,\n \"acc_norm_stderr\": 0.014537144444284743\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6093407687711612,\n \"acc_stderr\": 0.0048690101522807505,\n \"acc_norm\": 0.7916749651463851,\n \"acc_norm_stderr\": 0.004052804959005537\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3925925925925926,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.3925925925925926,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.04065771002562605,\n \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.04065771002562605\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4867924528301887,\n \"acc_stderr\": 0.030762134874500476,\n \"acc_norm\": 0.4867924528301887,\n \"acc_norm_stderr\": 0.030762134874500476\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.36416184971098264,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.36416184971098264,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.031967586978353627,\n \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.031967586978353627\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.328042328042328,\n \"acc_stderr\": 0.024180497164376896,\n \"acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376896\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.535483870967742,\n \"acc_stderr\": 0.028372287797962935,\n \"acc_norm\": 0.535483870967742,\n \"acc_norm_stderr\": 0.028372287797962935\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868407,\n \"acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868407\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.038835659779569286,\n \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.038835659779569286\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5757575757575758,\n \"acc_stderr\": 0.035212249088415845,\n \"acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.035212249088415845\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.03447478286414357,\n \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.03447478286414357\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.41794871794871796,\n \"acc_stderr\": 0.02500732988246122,\n \"acc_norm\": 0.41794871794871796,\n \"acc_norm_stderr\": 0.02500732988246122\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712166,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712166\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6458715596330276,\n \"acc_stderr\": 0.020504729013829118,\n \"acc_norm\": 0.6458715596330276,\n \"acc_norm_stderr\": 0.020504729013829118\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.03167468706828979,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.03167468706828979\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.03471157907953426,\n \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03471157907953426\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6286919831223629,\n \"acc_stderr\": 0.031450686007448596,\n \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.031450686007448596\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.5515695067264574,\n \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553894,\n \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553894\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6033057851239669,\n \"acc_stderr\": 0.044658697805310094,\n \"acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.044658697805310094\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179663,\n \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179663\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.04846748253977239,\n \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.04846748253977239\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6495726495726496,\n \"acc_stderr\": 0.0312561082442188,\n \"acc_norm\": 0.6495726495726496,\n \"acc_norm_stderr\": 0.0312561082442188\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6513409961685823,\n \"acc_stderr\": 
0.017041243143490974,\n \"acc_norm\": 0.6513409961685823,\n \"acc_norm_stderr\": 0.017041243143490974\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.026902900458666647,\n \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.026902900458666647\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.21675977653631284,\n \"acc_stderr\": 0.01378059848644335,\n \"acc_norm\": 0.21675977653631284,\n \"acc_norm_stderr\": 0.01378059848644335\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.477124183006536,\n \"acc_stderr\": 0.028599936776089782,\n \"acc_norm\": 0.477124183006536,\n \"acc_norm_stderr\": 0.028599936776089782\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5562700964630225,\n \"acc_stderr\": 0.02821768355665232,\n \"acc_norm\": 0.5562700964630225,\n \"acc_norm_stderr\": 0.02821768355665232\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5339506172839507,\n \"acc_stderr\": 0.027756535257347666,\n \"acc_norm\": 0.5339506172839507,\n \"acc_norm_stderr\": 0.027756535257347666\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.35815602836879434,\n \"acc_stderr\": 0.02860208586275942,\n \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.02860208586275942\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35528031290743156,\n \"acc_stderr\": 0.012223623364044037,\n \"acc_norm\": 0.35528031290743156,\n \"acc_norm_stderr\": 0.012223623364044037\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329387,\n \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329387\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.45751633986928103,\n \"acc_stderr\": 0.020154685712590888,\n \"acc_norm\": 0.45751633986928103,\n \"acc_norm_stderr\": 0.020154685712590888\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4448979591836735,\n \"acc_stderr\": 0.031814251181977865,\n \"acc_norm\": 0.4448979591836735,\n \"acc_norm_stderr\": 0.031814251181977865\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5870646766169154,\n \"acc_stderr\": 0.034815208033673474,\n \"acc_norm\": 0.5870646766169154,\n \"acc_norm_stderr\": 0.034815208033673474\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457922,\n \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457922\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3219094247246022,\n \"mc1_stderr\": 0.016355567611960404,\n \"mc2\": 0.5030678363563933,\n \"mc2_stderr\": 0.01589753382807047\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7024467245461721,\n \"acc_stderr\": 0.012849085254614662\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03411675511751327,\n \"acc_stderr\": 0.00500021260077329\n }\n}\n```", "repo_url": 
"https://huggingface.co/davzoku/frankencria-llama2-12.5b-v1.3-m.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|arc:challenge|25_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|gsm8k|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hellaswag|10_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-39-16.700250.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-39-16.700250.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-39-16.700250.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T15-39-16.700250.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-39-16.700250.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T15_39_16.700250", "path": ["**/details_harness|winogrande|5_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T15-39-16.700250.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_14T15_39_16.700250", "path": ["results_2024-02-14T15-39-16.700250.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T15-39-16.700250.parquet"]}]}]} | 2024-02-14T15:41:57+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of davzoku/frankencria-llama2-12.5b-v1.3-m.2
Dataset automatically created during the evaluation run of model davzoku/frankencria-llama2-12.5b-v1.3-m.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
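The example block that normally follows this sentence was stripped from this copy of the card; a minimal sketch, assuming the details repository follows the `details_<org>__<model>` naming pattern used elsewhere in this document (the `harness_winogrande_5` configuration does appear in this record's metadata):

```python
from datasets import load_dataset

# Hypothetical repo id derived from the org/model name above.
data = load_dataset("open-llm-leaderboard/details_davzoku__frankencria-llama2-12.5b-v1.3-m.2",
	"harness_winogrande_5",
	split="train")
```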
## Latest results
These are the latest results from run 2024-02-14T15:39:16.700250 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
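The results JSON itself was also stripped from this copy; a minimal sketch for fetching the raw file, assuming the hypothetical repository id above and a filename derived from the run timestamp:

```python
import json

from huggingface_hub import hf_hub_download

# Hypothetical repo id following the details_<org>__<model> convention;
# the filename is derived from the run timestamp quoted above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_davzoku__frankencria-llama2-12.5b-v1.3-m.2",
    filename="results_2024-02-14T15-39-16.700250.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
# The top-level layout may vary across harness versions; inspect keys first.
print(list(results))
```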
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of davzoku/frankencria-llama2-12.5b-v1.3-m.2\n\n\n\nDataset automatically created during the evaluation run of model davzoku/frankencria-llama2-12.5b-v1.3-m.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T15:39:16.700250(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of davzoku/frankencria-llama2-12.5b-v1.3-m.2\n\n\n\nDataset automatically created during the evaluation run of model davzoku/frankencria-llama2-12.5b-v1.3-m.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T15:39:16.700250(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
203,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of davzoku/frankencria-llama2-12.5b-v1.3-m.2\n\n\n\nDataset automatically created during the evaluation run of model davzoku/frankencria-llama2-12.5b-v1.3-m.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T15:39:16.700250(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
73ef4ee7d49a6fed4ea1efd65f82b4c95faeb9de | # Dataset Card for "vctk"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | sanchit-gandhi/vctk | [
"region:us"
] | 2024-02-14T15:48:55+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "speaker_id", "dtype": "string"}, {"name": "audio", "dtype": {"audio": {"sampling_rate": 48000}}}, {"name": "file", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "text_id", "dtype": "string"}, {"name": "age", "dtype": "string"}, {"name": "gender", "dtype": "string"}, {"name": "accent", "dtype": "string"}, {"name": "region", "dtype": "string"}, {"name": "comment", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 13244661397.632, "num_examples": 88156}], "download_size": 11715533521, "dataset_size": 13244661397.632}} | 2024-02-14T16:00:14+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "vctk"
More Information needed | [
"# Dataset Card for \"vctk\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"vctk\"\n\nMore Information needed"
] | [
6,
12
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"vctk\"\n\nMore Information needed"
] |
71121904c57d28147ad44e40da31d5cb84c97969 |
# Dataset Card for Evaluation run of yleo/EmertonMonarch-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yleo/EmertonMonarch-7B](https://huggingface.co/yleo/EmertonMonarch-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yleo__EmertonMonarch-7B",
"harness_winogrande_5",
split="train")
```
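Any of the per-task configurations listed in this repository's metadata can be loaded the same way by swapping the configuration name; for example, a sketch using the `harness_gsm8k_5` config name that appears in the metadata below:

```python
from datasets import load_dataset

# Same repository as above; only the configuration name changes.
gsm8k_details = load_dataset("open-llm-leaderboard/details_yleo__EmertonMonarch-7B",
	"harness_gsm8k_5",
	split="latest")
```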
## Latest results
These are the [latest results from run 2024-02-14T15:51:06.640306](https://huggingface.co/datasets/open-llm-leaderboard/details_yleo__EmertonMonarch-7B/blob/main/results_2024-02-14T15-51-06.640306.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6464761697140718,
"acc_stderr": 0.03222241013310851,
"acc_norm": 0.6461858589602053,
"acc_norm_stderr": 0.03289455177300504,
"mc1": 0.6291309669522643,
"mc1_stderr": 0.016909693580248835,
"mc2": 0.7809489116779263,
"mc2_stderr": 0.013701734554887294
},
"harness|arc:challenge|25": {
"acc": 0.7056313993174061,
"acc_stderr": 0.013318528460539422,
"acc_norm": 0.726962457337884,
"acc_norm_stderr": 0.013019332762635751
},
"harness|hellaswag|10": {
"acc": 0.7185819557857,
"acc_stderr": 0.004487718843330278,
"acc_norm": 0.8915554670384386,
"acc_norm_stderr": 0.0031030554162430565
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.03077805742293167,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.03077805742293167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.0399552400768168,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.0399552400768168
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530336,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530336
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.024685316867257796,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.024685316867257796
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42569832402234636,
"acc_stderr": 0.01653682964899711,
"acc_norm": 0.42569832402234636,
"acc_norm_stderr": 0.01653682964899711
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48435462842242505,
"acc_stderr": 0.012763982838120958,
"acc_norm": 0.48435462842242505,
"acc_norm_stderr": 0.012763982838120958
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6291309669522643,
"mc1_stderr": 0.016909693580248835,
"mc2": 0.7809489116779263,
"mc2_stderr": 0.013701734554887294
},
"harness|winogrande|5": {
"acc": 0.8516179952644041,
"acc_stderr": 0.009990706005184136
},
"harness|gsm8k|5": {
"acc": 0.6527672479150872,
"acc_stderr": 0.013113898382146877
}
}
```
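The aggregated numbers above can also be retrieved programmatically through the "results" configuration described earlier; a minimal sketch, assuming the same repository name as above (the exact column layout is not documented here, so inspect it before use):

```python
from datasets import load_dataset

results = load_dataset("open-llm-leaderboard/details_yleo__EmertonMonarch-7B",
	"results",
	split="latest")
# The column layout is an assumption to verify; list the fields first.
print(results.column_names)
```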
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_yleo__EmertonMonarch-7B | [
"region:us"
] | 2024-02-14T15:53:23+00:00 | {"pretty_name": "Evaluation run of yleo/EmertonMonarch-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [yleo/EmertonMonarch-7B](https://huggingface.co/yleo/EmertonMonarch-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yleo__EmertonMonarch-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T15:51:06.640306](https://huggingface.co/datasets/open-llm-leaderboard/details_yleo__EmertonMonarch-7B/blob/main/results_2024-02-14T15-51-06.640306.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6464761697140718,\n \"acc_stderr\": 0.03222241013310851,\n \"acc_norm\": 0.6461858589602053,\n \"acc_norm_stderr\": 0.03289455177300504,\n \"mc1\": 0.6291309669522643,\n \"mc1_stderr\": 0.016909693580248835,\n \"mc2\": 0.7809489116779263,\n \"mc2_stderr\": 0.013701734554887294\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7056313993174061,\n \"acc_stderr\": 0.013318528460539422,\n \"acc_norm\": 0.726962457337884,\n \"acc_norm_stderr\": 0.013019332762635751\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7185819557857,\n \"acc_stderr\": 0.004487718843330278,\n \"acc_norm\": 0.8915554670384386,\n \"acc_norm_stderr\": 0.0031030554162430565\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n 
\"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723292,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723292\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n 
\"acc_stderr\": 0.024078696580635477,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.03077805742293167,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.03077805742293167\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.0399552400768168,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.0399552400768168\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530336,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530336\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n 
\"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.024685316867257796,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.024685316867257796\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n \"acc_stderr\": 0.01653682964899711,\n \"acc_norm\": 0.42569832402234636,\n \"acc_norm_stderr\": 0.01653682964899711\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48435462842242505,\n \"acc_stderr\": 0.012763982838120958,\n \"acc_norm\": 0.48435462842242505,\n \"acc_norm_stderr\": 0.012763982838120958\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6291309669522643,\n \"mc1_stderr\": 0.016909693580248835,\n \"mc2\": 0.7809489116779263,\n \"mc2_stderr\": 0.013701734554887294\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8516179952644041,\n \"acc_stderr\": 0.009990706005184136\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6527672479150872,\n \"acc_stderr\": 0.013113898382146877\n }\n}\n```", "repo_url": "https://huggingface.co/yleo/EmertonMonarch-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|arc:challenge|25_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|gsm8k|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hellaswag|10_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-51-06.640306.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-51-06.640306.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-51-06.640306.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T15-51-06.640306.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-51-06.640306.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-51-06.640306.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["**/details_harness|winogrande|5_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T15-51-06.640306.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T15_51_06.640306", "path": ["results_2024-02-14T15-51-06.640306.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T15-51-06.640306.parquet"]}]}]} | 2024-02-14T15:53:46+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of yleo/EmertonMonarch-7B
Dataset automatically created during the evaluation run of model yleo/EmertonMonarch-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
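```python
from datasets import load_dataset

# Repository name inferred from the leaderboard's "details_<org>__<model>"
# naming pattern; swap in any other configuration listed in this card.
data = load_dataset("open-llm-leaderboard/details_yleo__EmertonMonarch-7B",
	"harness_winogrande_5",
	split="train")
```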
## Latest results
These are the latest results from run 2024-02-14T15:51:06.640306 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
b83c5f0c18d3f39f3ed13b25229476bb209377b4 |
# Dataset Card for Evaluation run of mlabonne/AlphaMonarch-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/AlphaMonarch-7B](https://huggingface.co/mlabonne/AlphaMonarch-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__AlphaMonarch-7B",
"harness_winogrande_5",
split="train")
```
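Each row of a details split corresponds to one evaluated example. The exact columns come from the evaluation harness and can differ between tasks, so a quick inspection is the safest way to see what is available (minimal sketch, assuming the split loaded above):

```python
# Peek at the loaded split; column names and row contents depend on the task.
print(data.column_names)
print(data[0])
```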
## Latest results
These are the [latest results from run 2024-02-14T16:09:07.620660](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__AlphaMonarch-7B/blob/main/results_2024-02-14T16-09-07.620660.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.64993303036545,
"acc_stderr": 0.03223275833989727,
"acc_norm": 0.6496784512587302,
"acc_norm_stderr": 0.032904185346138724,
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7790976929959287,
"mc2_stderr": 0.01374934735921556
},
"harness|arc:challenge|25": {
"acc": 0.7047781569965871,
"acc_stderr": 0.01332975029338232,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.012968040686869147
},
"harness|hellaswag|10": {
"acc": 0.7181836287592113,
"acc_stderr": 0.004489648865080874,
"acc_norm": 0.8917546305516829,
"acc_norm_stderr": 0.0031005509089161993
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.040261414976346104,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.040261414976346104
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530336,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530336
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41564245810055866,
"acc_stderr": 0.016482782187500666,
"acc_norm": 0.41564245810055866,
"acc_norm_stderr": 0.016482782187500666
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4817470664928292,
"acc_stderr": 0.012761723960595472,
"acc_norm": 0.4817470664928292,
"acc_norm_stderr": 0.012761723960595472
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.02850145286039655,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.02850145286039655
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7790976929959287,
"mc2_stderr": 0.01374934735921556
},
"harness|winogrande|5": {
"acc": 0.8468823993685872,
"acc_stderr": 0.010120623252272956
},
"harness|gsm8k|5": {
"acc": 0.6671721000758151,
"acc_stderr": 0.012979892496598287
}
}
```
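The same aggregated numbers are stored in the dataset itself under the "results" configuration, whose "latest" split always points to the most recent run. A minimal sketch for loading them, assuming the standard layout of these detail datasets:

```python
from datasets import load_dataset

# Aggregated metrics of the most recent run; treat the exact field layout
# of each row as an assumption, as it mirrors the JSON shown above.
results = load_dataset("open-llm-leaderboard/details_mlabonne__AlphaMonarch-7B",
	"results",
	split="latest")
print(results[0])
```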
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_mlabonne__AlphaMonarch-7B | [
"region:us"
] | 2024-02-14T16:11:24+00:00 | {"pretty_name": "Evaluation run of mlabonne/AlphaMonarch-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [mlabonne/AlphaMonarch-7B](https://huggingface.co/mlabonne/AlphaMonarch-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__AlphaMonarch-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T16:09:07.620660](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__AlphaMonarch-7B/blob/main/results_2024-02-14T16-09-07.620660.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.64993303036545,\n \"acc_stderr\": 0.03223275833989727,\n \"acc_norm\": 0.6496784512587302,\n \"acc_norm_stderr\": 0.032904185346138724,\n \"mc1\": 0.627906976744186,\n \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7790976929959287,\n \"mc2_stderr\": 0.01374934735921556\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7047781569965871,\n \"acc_stderr\": 0.01332975029338232,\n \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869147\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7181836287592113,\n \"acc_stderr\": 0.004489648865080874,\n \"acc_norm\": 0.8917546305516829,\n \"acc_norm_stderr\": 0.0031005509089161993\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n 
\"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n 
\"acc_stderr\": 0.02403548967633508,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.41721854304635764,\n \"acc_stderr\": 0.040261414976346104,\n \"acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.040261414976346104\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530336,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530336\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n 
\"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41564245810055866,\n \"acc_stderr\": 0.016482782187500666,\n \"acc_norm\": 0.41564245810055866,\n \"acc_norm_stderr\": 0.016482782187500666\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4817470664928292,\n \"acc_stderr\": 0.012761723960595472,\n \"acc_norm\": 0.4817470664928292,\n \"acc_norm_stderr\": 0.012761723960595472\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039655,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039655\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.627906976744186,\n \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7790976929959287,\n \"mc2_stderr\": 0.01374934735921556\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272956\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6671721000758151,\n \"acc_stderr\": 0.012979892496598287\n }\n}\n```", "repo_url": "https://huggingface.co/mlabonne/AlphaMonarch-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|arc:challenge|25_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|gsm8k|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hellaswag|10_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T16-09-07.620660.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T16-09-07.620660.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T16-09-07.620660.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T16-09-07.620660.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T16-09-07.620660.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T16-09-07.620660.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["**/details_harness|winogrande|5_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T16-09-07.620660.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T16_09_07.620660", "path": ["results_2024-02-14T16-09-07.620660.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T16-09-07.620660.parquet"]}]}]} | 2024-02-14T16:11:46+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of mlabonne/AlphaMonarch-7B
Dataset automatically created during the evaluation run of model mlabonne/AlphaMonarch-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
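```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__AlphaMonarch-7B",
	"harness_winogrande_5",
	split="train")
```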
## Latest results
These are the latest results from run 2024-02-14T16:09:07.620660 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
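The aggregate scores for this run are shown below; the per-task entries in the full results file follow the same `acc`/`acc_norm` (and stderr) structure.

```python
{
    "all": {
        "acc": 0.64993303036545,
        "acc_stderr": 0.03223275833989727,
        "acc_norm": 0.6496784512587302,
        "acc_norm_stderr": 0.032904185346138724,
        "mc1": 0.627906976744186,
        "mc1_stderr": 0.01692109011881403,
        "mc2": 0.7790976929959287,
        "mc2_stderr": 0.01374934735921556
    }
}
```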
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
fc505f61bf1079476ae7dd0c61fa9f821387fd37 |
# Dataset Card for Cantonese XStoryCloze
This dataset is a Cantonese translation of the Simplified Chinese subset of [juletxara/xstory_cloze](https://huggingface.co/datasets/juletxara/xstory_cloze). For more detailed information about the original dataset, please refer to the provided link.
This dataset was translated by indiejoseph/bart-translation-zh-yue and has not undergone any manual verification. The content may be inaccurate or misleading; please keep this in mind when using this dataset.
## Sample
```json
{
"input_sentence_1": "瑞克喺一個坎坷動盪嘅家庭出世。",
"input_sentence_2": "佢從來都冇喺屋企得到有力支持,所以轉而投身幫派。",
"input_sentence_3": "冇耐前瑞克喺一單搶劫案中中彈。",
"input_sentence_4": "呢件事令佢改過自新、重新做人。",
"sentence_quiz1": "佢而家好開心。",
"sentence_quiz2": "佢加入咗一個幫派。",
"story_id": "138d5bfb-05cc-41e3-bf2c-fa85ebad14e2",
"answer_right_ending": 1
}
```
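A minimal loading sketch, assuming the default configuration resolves directly from the repository id (split names are not specified on this card and are an assumption):

```python
from datasets import load_dataset

# Repository id as listed for this dataset.
ds = load_dataset("hon9kon9ize/yue_xstory_cloze")

# Each record carries four context sentences, two candidate endings, and
# answer_right_ending marking the correct ending (see the sample above).
for split_name, split in ds.items():
    example = split[0]
    print(split_name, example["story_id"], example["answer_right_ending"])
```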
## License
This dataset is provided under the same license as the original dataset: CC BY-SA 4.0
## Limitation and Usage Limits
Please check the original dataset for more information.
| hon9kon9ize/yue_xstory_cloze | [
"language:yue",
"license:cc-by-sa-4.0",
"region:us"
] | 2024-02-14T16:17:38+00:00 | {"language": ["yue"], "license": "cc-by-sa-4.0"} | 2024-02-14T16:33:37+00:00 | [] | [
"yue"
]
48144cc0203926bc18eea5a76463d2f91f19b435 |
# Dataset Card for Evaluation run of arlineka/Brunhilde-13b-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [arlineka/Brunhilde-13b-v1](https://huggingface.co/arlineka/Brunhilde-13b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_arlineka__Brunhilde-13b-v1",
"harness_winogrande_5",
split="train")
```
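Assuming the split loads as above, a quick sanity check might look like this (the exact column names depend on the harness task and are not specified on this card):

```python
# Inspect the split's schema and the first evaluated example.
print(data)      # row count and column names
print(data[0])   # one per-example record from the winogrande details
```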
## Latest results
These are the [latest results from run 2024-02-14T16:23:09.568559](https://huggingface.co/datasets/open-llm-leaderboard/details_arlineka__Brunhilde-13b-v1/blob/main/results_2024-02-14T16-23-09.568559.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5526426364814372,
"acc_stderr": 0.03369431998877289,
"acc_norm": 0.5589867767589362,
"acc_norm_stderr": 0.03442255189082874,
"mc1": 0.3659730722154223,
"mc1_stderr": 0.01686294168408838,
"mc2": 0.5198242782926871,
"mc2_stderr": 0.015737761660039342
},
"harness|arc:challenge|25": {
"acc": 0.5819112627986348,
"acc_stderr": 0.014413988396996074,
"acc_norm": 0.6109215017064846,
"acc_norm_stderr": 0.014247309976045607
},
"harness|hellaswag|10": {
"acc": 0.6427006572395937,
"acc_stderr": 0.004782246931194997,
"acc_norm": 0.8357896833300139,
"acc_norm_stderr": 0.0036970918376320718
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5471698113207547,
"acc_stderr": 0.030635627957961823,
"acc_norm": 0.5471698113207547,
"acc_norm_stderr": 0.030635627957961823
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670788,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670788
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237656,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237656
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523864,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.027273890594300645,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.027273890594300645
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.032894773300986155,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.032894773300986155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178277,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178277
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.025342671293807257,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.025342671293807257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236153,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236153
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7174311926605504,
"acc_stderr": 0.01930424349770715,
"acc_norm": 0.7174311926605504,
"acc_norm_stderr": 0.01930424349770715
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145635
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.02782078198114968,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.02782078198114968
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.04260735157644559,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.04260735157644559
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650741,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650741
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280042,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280042
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890488,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890488
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.768837803320562,
"acc_stderr": 0.015075523238101072,
"acc_norm": 0.768837803320562,
"acc_norm_stderr": 0.015075523238101072
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6213872832369942,
"acc_stderr": 0.02611374936131035,
"acc_norm": 0.6213872832369942,
"acc_norm_stderr": 0.02611374936131035
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.016593394227564843,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.016593394227564843
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.028146405993096358,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.028146405993096358
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934023,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934023
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.026915003011380154,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.026915003011380154
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.02927553215970473,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.02927553215970473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42698826597131684,
"acc_stderr": 0.012633353557534423,
"acc_norm": 0.42698826597131684,
"acc_norm_stderr": 0.012633353557534423
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.019997973035458333,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.019997973035458333
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.03106721126287247,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.03106721126287247
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6915422885572139,
"acc_stderr": 0.03265819588512699,
"acc_norm": 0.6915422885572139,
"acc_norm_stderr": 0.03265819588512699
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3659730722154223,
"mc1_stderr": 0.01686294168408838,
"mc2": 0.5198242782926871,
"mc2_stderr": 0.015737761660039342
},
"harness|winogrande|5": {
"acc": 0.7521704814522494,
"acc_stderr": 0.01213438601986535
},
"harness|gsm8k|5": {
"acc": 0.20090978013646701,
"acc_stderr": 0.011036738221872358
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_arlineka__Brunhilde-13b-v1 | [
"region:us"
] | 2024-02-14T16:25:26+00:00 | {"pretty_name": "Evaluation run of arlineka/Brunhilde-13b-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [arlineka/Brunhilde-13b-v1](https://huggingface.co/arlineka/Brunhilde-13b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_arlineka__Brunhilde-13b-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T16:23:09.568559](https://huggingface.co/datasets/open-llm-leaderboard/details_arlineka__Brunhilde-13b-v1/blob/main/results_2024-02-14T16-23-09.568559.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5526426364814372,\n \"acc_stderr\": 0.03369431998877289,\n \"acc_norm\": 0.5589867767589362,\n \"acc_norm_stderr\": 0.03442255189082874,\n \"mc1\": 0.3659730722154223,\n \"mc1_stderr\": 0.01686294168408838,\n \"mc2\": 0.5198242782926871,\n \"mc2_stderr\": 0.015737761660039342\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5819112627986348,\n \"acc_stderr\": 0.014413988396996074,\n \"acc_norm\": 0.6109215017064846,\n \"acc_norm_stderr\": 0.014247309976045607\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6427006572395937,\n \"acc_stderr\": 0.004782246931194997,\n \"acc_norm\": 0.8357896833300139,\n \"acc_norm_stderr\": 0.0036970918376320718\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.040657710025626036,\n \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.040657710025626036\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5471698113207547,\n \"acc_stderr\": 0.030635627957961823,\n \"acc_norm\": 0.5471698113207547,\n \"acc_norm_stderr\": 0.030635627957961823\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n \"acc_stderr\": 0.04112490974670788,\n \"acc_norm\": 0.5902777777777778,\n \"acc_norm_stderr\": 0.04112490974670788\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 
0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237656,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237656\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.032650194750335815,\n \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.032650194750335815\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523864,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523864\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n \"acc_stderr\": 0.027273890594300645,\n \"acc_norm\": 0.6419354838709678,\n \"acc_norm_stderr\": 0.027273890594300645\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6919191919191919,\n \"acc_stderr\": 0.032894773300986155,\n \"acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.032894773300986155\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178277,\n \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178277\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.025342671293807257,\n \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.025342671293807257\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236153,\n \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236153\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7174311926605504,\n \"acc_stderr\": 0.01930424349770715,\n \"acc_norm\": 0.7174311926605504,\n \"acc_norm_stderr\": 0.01930424349770715\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.37962962962962965,\n \"acc_stderr\": 0.03309682581119035,\n \"acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.03309682581119035\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145635,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145635\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114968,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114968\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.04260735157644559,\n \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.04260735157644559\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650741,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650741\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n \"acc_stderr\": 0.026246772946890488,\n \"acc_norm\": 0.7991452991452992,\n \"acc_norm_stderr\": 0.026246772946890488\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.768837803320562,\n \"acc_stderr\": 0.015075523238101072,\n \"acc_norm\": 0.768837803320562,\n \"acc_norm_stderr\": 0.015075523238101072\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6213872832369942,\n \"acc_stderr\": 0.02611374936131035,\n \"acc_norm\": 0.6213872832369942,\n \"acc_norm_stderr\": 0.02611374936131035\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n \"acc_stderr\": 0.016593394227564843,\n \"acc_norm\": 0.43798882681564244,\n \"acc_norm_stderr\": 0.016593394227564843\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n \"acc_stderr\": 0.027210420375934023,\n \"acc_norm\": 0.6430868167202572,\n \"acc_norm_stderr\": 0.027210420375934023\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380154,\n \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380154\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.02927553215970473,\n \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.02927553215970473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n \"acc_stderr\": 0.012633353557534423,\n \"acc_norm\": 0.42698826597131684,\n \"acc_norm_stderr\": 0.012633353557534423\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5751633986928104,\n \"acc_stderr\": 0.019997973035458333,\n \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.019997973035458333\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.03106721126287247,\n \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.03106721126287247\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n \"acc_stderr\": 0.03265819588512699,\n \"acc_norm\": 0.6915422885572139,\n \"acc_norm_stderr\": 0.03265819588512699\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n \"mc1_stderr\": 0.01686294168408838,\n \"mc2\": 0.5198242782926871,\n \"mc2_stderr\": 0.015737761660039342\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.01213438601986535\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20090978013646701,\n \"acc_stderr\": 
0.011036738221872358\n }\n}\n```", "repo_url": "https://huggingface.co/arlineka/Brunhilde-13b-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|arc:challenge|25_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|gsm8k|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hellaswag|10_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T16-23-09.568559.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T16-23-09.568559.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T16-23-09.568559.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T16-23-09.568559.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T16-23-09.568559.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T16_23_09.568559", "path": ["**/details_harness|winogrande|5_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T16-23-09.568559.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_14T16_23_09.568559", "path": ["results_2024-02-14T16-23-09.568559.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T16-23-09.568559.parquet"]}]}]} | 2024-02-14T16:25:50+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of arlineka/Brunhilde-13b-v1
Dataset automatically created during the evaluation run of model arlineka/Brunhilde-13b-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
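A minimal sketch, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming convention:

```python
from datasets import load_dataset

# repo id assumed from the leaderboard's details_<org>__<model> convention
data = load_dataset("open-llm-leaderboard/details_arlineka__Brunhilde-13b-v1",
	"harness_winogrande_5",
	split="train")
```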
## Latest results
These are the latest results from run 2024-02-14T16:23:09.568559 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
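The raw per-run JSON is not reproduced here; the sketch below shows one way to retrieve it, where both the repo id and the file name are assumptions derived from the run timestamp above:

```python
import json
from huggingface_hub import hf_hub_download

# repo id and file name are assumed from the naming pattern used by these cards
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_arlineka__Brunhilde-13b-v1",
    filename="results_2024-02-14T16-23-09.568559.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
print(results.keys())
```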
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
dd3340d273fbc0da47bc96f7c0acbd19baa6f94d |
# Dataset Card for Evaluation run of yleo/EmertonMonarch-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yleo/EmertonMonarch-7B-slerp](https://huggingface.co/yleo/EmertonMonarch-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yleo__EmertonMonarch-7B-slerp",
"harness_winogrande_5",
split="train")
```
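To read only the aggregated metrics instead of per-example details, the "results" configuration can be loaded the same way; a small sketch (the "latest" split always points at the newest run):

```python
from datasets import load_dataset

# aggregated metrics only; "latest" tracks the most recent run
results = load_dataset("open-llm-leaderboard/details_yleo__EmertonMonarch-7B-slerp",
	"results",
	split="latest")
```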
## Latest results
These are the [latest results from run 2024-02-14T17:09:33.259511](https://huggingface.co/datasets/open-llm-leaderboard/details_yleo__EmertonMonarch-7B-slerp/blob/main/results_2024-02-14T17-09-33.259511.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6501150387509795,
"acc_stderr": 0.03215193957481397,
"acc_norm": 0.6499843763084299,
"acc_norm_stderr": 0.032817026545135845,
"mc1": 0.6070991432068543,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.765485798344535,
"mc2_stderr": 0.013934106144304993
},
"harness|arc:challenge|25": {
"acc": 0.7005119453924915,
"acc_stderr": 0.013385021637313572,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.012968040686869148
},
"harness|hellaswag|10": {
"acc": 0.7105158334993029,
"acc_stderr": 0.004525960965551706,
"acc_norm": 0.8893646683927504,
"acc_norm_stderr": 0.0031303894668332022
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135356,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135356
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.04006485685365342,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.04006485685365342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163224,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163224
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41564245810055866,
"acc_stderr": 0.01648278218750067,
"acc_norm": 0.41564245810055866,
"acc_norm_stderr": 0.01648278218750067
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4758800521512386,
"acc_stderr": 0.012755368722863933,
"acc_norm": 0.4758800521512386,
"acc_norm_stderr": 0.012755368722863933
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6070991432068543,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.765485798344535,
"mc2_stderr": 0.013934106144304993
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.010410849775222789
},
"harness|gsm8k|5": {
"acc": 0.6808188021228203,
"acc_stderr": 0.012840345676251651
}
}
```
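Each `harness|hendrycksTest-*` entry above carries its own `acc_norm`, so a per-benchmark average can be recomputed from the payload; a minimal sketch (only two subtask entries are reproduced here, and the harness's own "all" aggregate may be weighted differently):

```python
# sketch over the structure shown above; values copied from two of the entries
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6296296296296297},
}
mmlu = [v["acc_norm"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest")]
print(f"unweighted mean acc_norm over {len(mmlu)} subtasks: {sum(mmlu) / len(mmlu):.4f}")
```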
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_yleo__EmertonMonarch-7B-slerp | [
"region:us"
] | 2024-02-14T16:30:31+00:00 | {"pretty_name": "Evaluation run of yleo/EmertonMonarch-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [yleo/EmertonMonarch-7B-slerp](https://huggingface.co/yleo/EmertonMonarch-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yleo__EmertonMonarch-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T17:09:33.259511](https://huggingface.co/datasets/open-llm-leaderboard/details_yleo__EmertonMonarch-7B-slerp/blob/main/results_2024-02-14T17-09-33.259511.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6501150387509795,\n \"acc_stderr\": 0.03215193957481397,\n \"acc_norm\": 0.6499843763084299,\n \"acc_norm_stderr\": 0.032817026545135845,\n \"mc1\": 0.6070991432068543,\n \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.765485798344535,\n \"mc2_stderr\": 0.013934106144304993\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.013385021637313572,\n \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869148\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7105158334993029,\n \"acc_stderr\": 0.004525960965551706,\n \"acc_norm\": 0.8893646683927504,\n \"acc_norm_stderr\": 0.0031303894668332022\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135356,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135356\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.40397350993377484,\n \"acc_stderr\": 0.04006485685365342,\n \"acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.04006485685365342\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163224,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163224\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8275862068965517,\n \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41564245810055866,\n \"acc_stderr\": 0.01648278218750067,\n \"acc_norm\": 0.41564245810055866,\n \"acc_norm_stderr\": 0.01648278218750067\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4758800521512386,\n \"acc_stderr\": 0.012755368722863933,\n \"acc_norm\": 0.4758800521512386,\n \"acc_norm_stderr\": 0.012755368722863933\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6070991432068543,\n \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.765485798344535,\n \"mc2_stderr\": 0.013934106144304993\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222789\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6808188021228203,\n \"acc_stderr\": 0.012840345676251651\n 
}\n}\n```", "repo_url": "https://huggingface.co/yleo/EmertonMonarch-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|arc:challenge|25_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|arc:challenge|25_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|arc:challenge|25_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|gsm8k|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|gsm8k|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|gsm8k|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hellaswag|10_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hellaswag|10_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hellaswag|10_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T16-28-10.749584.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T16-28-10.749584.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T16-28-10.749584.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T16-46-48.387931.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T16-46-48.387931.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T17-09-33.259511.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T17-09-33.259511.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T17-09-33.259511.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T17-09-33.259511.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T17-09-33.259511.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": 
"2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": 
["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", 
"data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": 
["**/details_harness|hendrycksTest-sociology|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["**/details_harness|winogrande|5_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": ["**/details_harness|winogrande|5_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["**/details_harness|winogrande|5_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T17-09-33.259511.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T16_28_10.749584", "path": ["results_2024-02-14T16-28-10.749584.parquet"]}, {"split": "2024_02_14T16_46_48.387931", "path": 
["results_2024-02-14T16-46-48.387931.parquet"]}, {"split": "2024_02_14T17_09_33.259511", "path": ["results_2024-02-14T17-09-33.259511.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T17-09-33.259511.parquet"]}]}]} | 2024-02-14T17:11:56+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of yleo/EmertonMonarch-7B-slerp
Dataset automatically created during the evaluation run of model yleo/EmertonMonarch-7B-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
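A minimal sketch, following the standard naming convention these evaluation-run repositories use (the details repository name is derived from the model name, so treat it as an assumption):

```python
from datasets import load_dataset

# Repository name assumed from the usual "details_<org>__<model>" convention
data = load_dataset("open-llm-leaderboard/details_yleo__EmertonMonarch-7B-slerp",
	"harness_winogrande_5",
	split="train")
```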
## Latest results
These are the latest results from run 2024-02-14T17:09:33.259511 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
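The aggregated metrics for this run live in the "results" configuration; a minimal sketch for pulling the newest run, under the same repository-name assumption as above:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split always
# points to the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_yleo__EmertonMonarch-7B-slerp",
	"results",
	split="latest")
```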
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of yleo/EmertonMonarch-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model yleo/EmertonMonarch-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T17:09:33.259511(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of yleo/EmertonMonarch-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model yleo/EmertonMonarch-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T17:09:33.259511(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of yleo/EmertonMonarch-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model yleo/EmertonMonarch-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T17:09:33.259511(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
5a69b621fe86020a2413bea863d014c895d83cd3 | # Dataset Card for "wsd_myriade_synth_data_id_label"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | gguichard/wsd_myriade_synth_data_id_label | [
"region:us"
] | 2024-02-14T16:38:45+00:00 | {"dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "wn_sens", "sequence": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 57036817, "num_examples": 101321}], "download_size": 9970599, "dataset_size": 57036817}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-15T08:52:43+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "wsd_myriade_synth_data_id_label"
More Information needed | [
"# Dataset Card for \"wsd_myriade_synth_data_id_label\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"wsd_myriade_synth_data_id_label\"\n\nMore Information needed"
] | [
6,
25
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"wsd_myriade_synth_data_id_label\"\n\nMore Information needed"
] |
7a3a3eb0d2183970790b4220420280f37cd5566e | Contains labelled smart contracts | bitaudit/audit_verification_dataset | [
"license:mit",
"code",
"region:us"
] | 2024-02-14T16:58:24+00:00 | {"license": "mit", "tags": ["code"]} | 2024-02-14T17:01:06+00:00 | [] | [] | TAGS
#license-mit #code #region-us
| Contains labelled smart contracts | [] | [
"TAGS\n#license-mit #code #region-us \n"
] | [
13
] | [
"passage: TAGS\n#license-mit #code #region-us \n"
] |
7bbf1d6c8a8bff3cc3dfd58c812b63a46720ac07 |
# Dataset Card for Evaluation run of yleo/OgnoMonarch-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yleo/OgnoMonarch-7B](https://huggingface.co/yleo/OgnoMonarch-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yleo__OgnoMonarch-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-14T17:03:12.176917](https://huggingface.co/datasets/open-llm-leaderboard/details_yleo__OgnoMonarch-7B/blob/main/results_2024-02-14T17-03-12.176917.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6551821905597185,
"acc_stderr": 0.03206589629652231,
"acc_norm": 0.6547282038076654,
"acc_norm_stderr": 0.032735903511765224,
"mc1": 0.6070991432068543,
"mc1_stderr": 0.01709724828523307,
"mc2": 0.7706471953940155,
"mc2_stderr": 0.013780753973380088
},
"harness|arc:challenge|25": {
"acc": 0.7022184300341296,
"acc_stderr": 0.013363080107244482,
"acc_norm": 0.7261092150170648,
"acc_norm_stderr": 0.013032004972989506
},
"harness|hellaswag|10": {
"acc": 0.7075283808006373,
"acc_stderr": 0.004539680764142179,
"acc_norm": 0.8891655048795061,
"acc_norm_stderr": 0.0031328549889236587
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059004,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059004
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778394,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43910614525139663,
"acc_stderr": 0.016598022120580428,
"acc_norm": 0.43910614525139663,
"acc_norm_stderr": 0.016598022120580428
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460842,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4784876140808344,
"acc_stderr": 0.012758410941038913,
"acc_norm": 0.4784876140808344,
"acc_norm_stderr": 0.012758410941038913
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6070991432068543,
"mc1_stderr": 0.01709724828523307,
"mc2": 0.7706471953940155,
"mc2_stderr": 0.013780753973380088
},
"harness|winogrande|5": {
"acc": 0.8421468034727704,
"acc_stderr": 0.010247165248719763
},
"harness|gsm8k|5": {
"acc": 0.7005307050796058,
"acc_stderr": 0.012616300735519656
}
}
```
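For programmatic access, the aggregated numbers above can also be loaded directly from the `results` configuration of this dataset (a minimal sketch; the `results` config and its `latest` split are the ones declared in this dataset's configuration metadata):

```python
from datasets import load_dataset

# Load the aggregated results of the latest evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_yleo__OgnoMonarch-7B",
    "results",
    split="latest",
)
```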
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_yleo__OgnoMonarch-7B | [
"region:us"
] | 2024-02-14T17:05:29+00:00 | {"pretty_name": "Evaluation run of yleo/OgnoMonarch-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [yleo/OgnoMonarch-7B](https://huggingface.co/yleo/OgnoMonarch-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yleo__OgnoMonarch-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T17:03:12.176917](https://huggingface.co/datasets/open-llm-leaderboard/details_yleo__OgnoMonarch-7B/blob/main/results_2024-02-14T17-03-12.176917.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6551821905597185,\n \"acc_stderr\": 0.03206589629652231,\n \"acc_norm\": 0.6547282038076654,\n \"acc_norm_stderr\": 0.032735903511765224,\n \"mc1\": 0.6070991432068543,\n \"mc1_stderr\": 0.01709724828523307,\n \"mc2\": 0.7706471953940155,\n \"mc2_stderr\": 0.013780753973380088\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.013363080107244482,\n \"acc_norm\": 0.7261092150170648,\n \"acc_norm_stderr\": 0.013032004972989506\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7075283808006373,\n \"acc_stderr\": 0.004539680764142179,\n \"acc_norm\": 0.8891655048795061,\n \"acc_norm_stderr\": 0.0031328549889236587\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03309615177059004,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03309615177059004\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n 
\"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n 
\"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n \"acc_norm_stderr\": 
0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n \"acc_stderr\": 0.016598022120580428,\n \"acc_norm\": 0.43910614525139663,\n \"acc_norm_stderr\": 0.016598022120580428\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460842,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460842\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4784876140808344,\n \"acc_stderr\": 0.012758410941038913,\n \"acc_norm\": 0.4784876140808344,\n \"acc_norm_stderr\": 0.012758410941038913\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6070991432068543,\n \"mc1_stderr\": 0.01709724828523307,\n \"mc2\": 0.7706471953940155,\n \"mc2_stderr\": 0.013780753973380088\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8421468034727704,\n \"acc_stderr\": 0.010247165248719763\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7005307050796058,\n \"acc_stderr\": 0.012616300735519656\n }\n}\n```", "repo_url": "https://huggingface.co/yleo/OgnoMonarch-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|arc:challenge|25_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|gsm8k|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hellaswag|10_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T17-03-12.176917.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T17-03-12.176917.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T17-03-12.176917.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T17-03-12.176917.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T17-03-12.176917.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T17-03-12.176917.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["**/details_harness|winogrande|5_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T17-03-12.176917.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T17_03_12.176917", "path": ["results_2024-02-14T17-03-12.176917.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T17-03-12.176917.parquet"]}]}]} | 2024-02-14T17:05:53+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of yleo/OgnoMonarch-7B
Dataset automatically created during the evaluation run of model yleo/OgnoMonarch-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
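A minimal example in Python (the `harness_winogrande_5` configuration is one of the per-task configs listed in this dataset's metadata):

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task
data = load_dataset(
    "open-llm-leaderboard/details_yleo__OgnoMonarch-7B",
    "harness_winogrande_5",
    split="train",
)
```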
## Latest results
These are the latest results from run 2024-02-14T17:03:12.176917 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of yleo/OgnoMonarch-7B\n\n\n\nDataset automatically created during the evaluation run of model yleo/OgnoMonarch-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T17:03:12.176917(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of yleo/OgnoMonarch-7B\n\n\n\nDataset automatically created during the evaluation run of model yleo/OgnoMonarch-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T17:03:12.176917(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
179,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of yleo/OgnoMonarch-7B\n\n\n\nDataset automatically created during the evaluation run of model yleo/OgnoMonarch-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-02-14T17:03:12.176917(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
dec6d5d1df04a0c7fdc67d2c44086fa9520c39dc | # Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | 3it/bitaudit_verification_dataset | [
"license:mit",
"code",
"region:us"
] | 2024-02-14T17:19:45+00:00 | {"license": "mit", "tags": ["code"]} | 2024-02-14T17:21:34+00:00 | [] | [] | TAGS
#license-mit #code #region-us
| # Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#license-mit #code #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
13,
34,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#license-mit #code #region-us \n# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
98072fd85f4c6a3b4e8fbdbc4d9a6cc9401cae8a |
## Dataset Information
| # Nodes | # Edges | # Features |
|:-------:|:-------:|:----------:|
| 3,327 | 9,104 | 3,703 |
Pre-processed as per the official codebase of https://arxiv.org/abs/2210.02016
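A minimal sketch for pulling the pre-processed files locally (the file layout inside the repo is not documented here, so this simply snapshots the whole dataset; the same pattern applies to the other pareto-* datasets below):

```python
from huggingface_hub import snapshot_download

# Download every file in the dataset repo and print the local path;
# inspect that directory to find the pre-processed graph files.
local_dir = snapshot_download(
    repo_id="SauravMaheshkar/pareto-citeseer",
    repo_type="dataset",
)
print(local_dir)
```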
## Citations
```
@inproceedings{ju2023multi,
title={Multi-task Self-supervised Graph Neural Networks Enable Stronger Task Generalization},
author={Ju, Mingxuan and Zhao, Tong and Wen, Qianlong and Yu, Wenhao and Shah, Neil and Ye, Yanfang and Zhang, Chuxu},
booktitle={International Conference on Learning Representations},
year={2023}
}
``` | SauravMaheshkar/pareto-citeseer | [
"task_categories:graph-ml",
"size_categories:1K<n<10K",
"license:cc",
"arxiv:2210.02016",
"region:us"
] | 2024-02-14T17:23:02+00:00 | {"license": "cc", "size_categories": ["1K<n<10K"], "task_categories": ["graph-ml"]} | 2024-02-14T17:26:46+00:00 | [
"2210.02016"
] | [] | TAGS
#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us
| Dataset Information
-------------------
Pre-processed as per the official codebase of URL
| [] | [
"TAGS\n#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us \n"
] | [
41
] | [
"passage: TAGS\n#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us \n"
] |
277a0e58bb69c2712df47074e9067b4c265f1892 |
## Dataset Information
| # Nodes | # Edges | # Features |
|:-------:|:-------:|:----------:|
| 11,701 | 216,123 | 300 |
Pre-processed as per the official codebase of https://arxiv.org/abs/2210.02016
## Citations
```
@inproceedings{ju2023multi,
title={Multi-task Self-supervised Graph Neural Networks Enable Stronger Task Generalization},
author={Ju, Mingxuan and Zhao, Tong and Wen, Qianlong and Yu, Wenhao and Shah, Neil and Ye, Yanfang and Zhang, Chuxu},
booktitle={International Conference on Learning Representations},
year={2023}
}
``` | SauravMaheshkar/pareto-wiki-cs | [
"task_categories:graph-ml",
"size_categories:1K<n<10K",
"license:cc",
"arxiv:2210.02016",
"region:us"
] | 2024-02-14T17:28:16+00:00 | {"license": "cc", "size_categories": ["1K<n<10K"], "task_categories": ["graph-ml"]} | 2024-02-14T17:31:39+00:00 | [
"2210.02016"
] | [] | TAGS
#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us
| Dataset Information
-------------------
Pre-processed as per the official codebase of URL
| [] | [
"TAGS\n#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us \n"
] | [
41
] | [
"passage: TAGS\n#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us \n"
] |
ba26a22efa6033ca6b7a77c6b68c03eb1de8753d |
## Dataset Information
| # Nodes | # Edges | # Features |
|:-------:|:-------:|:----------:|
| 18,333 | 81,894 | 6,805 |
Pre-processed as per the official codebase of https://arxiv.org/abs/2210.02016
## Citations
```
@inproceedings{ju2023multi,
title={Multi-task Self-supervised Graph Neural Networks Enable Stronger Task Generalization},
author={Ju, Mingxuan and Zhao, Tong and Wen, Qianlong and Yu, Wenhao and Shah, Neil and Ye, Yanfang and Zhang, Chuxu},
booktitle={International Conference on Learning Representations},
year={2023}
}
``` | SauravMaheshkar/pareto-coauthor-cs | [
"task_categories:graph-ml",
"size_categories:1K<n<10K",
"license:cc",
"arxiv:2210.02016",
"region:us"
] | 2024-02-14T17:34:11+00:00 | {"license": "cc", "size_categories": ["1K<n<10K"], "task_categories": ["graph-ml"]} | 2024-02-14T17:37:49+00:00 | [
"2210.02016"
] | [] | TAGS
#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us
| Dataset Information
-------------------
Pre-processed as per the official codebase of URL
| [] | [
"TAGS\n#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us \n"
] | [
41
] | [
"passage: TAGS\n#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us \n"
] |
8214e09e962973b6c4065773033aedf68df3a3ce |
## Dataset Information
| # Nodes | # Edges | # Features |
|:-------:|:-------:|:----------:|
| 13,752 | 245,778 | 767 |
Pre-processed as per the official codebase of https://arxiv.org/abs/2210.02016
## Citations
```
@inproceedings{ju2023multi,
title={Multi-task Self-supervised Graph Neural Networks Enable Stronger Task Generalization},
author={Ju, Mingxuan and Zhao, Tong and Wen, Qianlong and Yu, Wenhao and Shah, Neil and Ye, Yanfang and Zhang, Chuxu},
booktitle={International Conference on Learning Representations},
year={2023}
}
``` | SauravMaheshkar/pareto-amazon-computer | [
"task_categories:graph-ml",
"size_categories:1K<n<10K",
"license:cc",
"arxiv:2210.02016",
"region:us"
] | 2024-02-14T17:39:42+00:00 | {"license": "cc", "size_categories": ["1K<n<10K"], "task_categories": ["graph-ml"]} | 2024-02-14T17:42:52+00:00 | [
"2210.02016"
] | [] | TAGS
#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us
| Dataset Information
-------------------
Pre-processed as per the official codebase of URL
| [] | [
"TAGS\n#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us \n"
] | [
41
] | [
"passage: TAGS\n#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us \n"
] |
b80ed009de98eea141623c4690f4ddbbc1b8ec2a |
## Dataset Information
| # Nodes | # Edges | # Features |
|:-------:|:-------:|:----------:|
| 7,650 | 119,043 | 745 |
Pre-processed as per the official codebase of https://arxiv.org/abs/2210.02016
## Citations
```
@inproceedings{ju2023multi,
title={Multi-task Self-supervised Graph Neural Networks Enable Stronger Task Generalization},
author={Ju, Mingxuan and Zhao, Tong and Wen, Qianlong and Yu, Wenhao and Shah, Neil and Ye, Yanfang and Zhang, Chuxu},
booktitle={International Conference on Learning Representations},
year={2023}
}
``` | SauravMaheshkar/pareto-amazon-photo | [
"task_categories:graph-ml",
"size_categories:1K<n<10K",
"license:cc",
"arxiv:2210.02016",
"region:us"
] | 2024-02-14T17:43:40+00:00 | {"license": "cc", "size_categories": ["1K<n<10K"], "task_categories": ["graph-ml"]} | 2024-02-14T17:46:23+00:00 | [
"2210.02016"
] | [] | TAGS
#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us
| Dataset Information
-------------------
Pre-processed as per the official codebase of URL
| [] | [
"TAGS\n#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us \n"
] | [
41
] | [
"passage: TAGS\n#task_categories-graph-ml #size_categories-1K<n<10K #license-cc #arxiv-2210.02016 #region-us \n"
] |