# Dataset Card for Evaluation run of giraffe176/Open_Hermes_Orca_Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [giraffe176/Open_Hermes_Orca_Mistral-7B](https://huggingface.co/giraffe176/Open_Hermes_Orca_Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_giraffe176__Open_Hermes_Orca_Mistral-7B",
"harness_winogrande_5",
split="train")
```
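The aggregated metrics can be loaded the same way through the "results" configuration. A minimal sketch, assuming the "latest" split listed in the dataset configs; the exact column layout of the results split is not documented here, so inspect it after loading:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; "latest" points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_giraffe176__Open_Hermes_Orca_Mistral-7B",
    "results",
    split="latest",
)

# Inspect the available columns before relying on any particular field (layout is an assumption).
print(results.column_names)
print(results[0])
```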
## Latest results
These are the [latest results from run 2024-02-11T08:58:14.779696](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__Open_Hermes_Orca_Mistral-7B/blob/main/results_2024-02-11T08-58-14.779696.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.640271253808845,
"acc_stderr": 0.03218451098723128,
"acc_norm": 0.6429009887255552,
"acc_norm_stderr": 0.03282441851605668,
"mc1": 0.3561811505507956,
"mc1_stderr": 0.016763790728446335,
"mc2": 0.5334094436292038,
"mc2_stderr": 0.015404491531393148
},
"harness|arc:challenge|25": {
"acc": 0.6143344709897611,
"acc_stderr": 0.014224250973257187,
"acc_norm": 0.6467576791808873,
"acc_norm_stderr": 0.013967822714840058
},
"harness|hellaswag|10": {
"acc": 0.657837084246166,
"acc_stderr": 0.004734642167493352,
"acc_norm": 0.8463453495319657,
"acc_norm_stderr": 0.0035988038554606344
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406776,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406776
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945627,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501534,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501534
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676166,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676166
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899136,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3474860335195531,
"acc_stderr": 0.01592556406020815,
"acc_norm": 0.3474860335195531,
"acc_norm_stderr": 0.01592556406020815
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.01273239828619044,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.01273239828619044
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233257,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233257
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3561811505507956,
"mc1_stderr": 0.016763790728446335,
"mc2": 0.5334094436292038,
"mc2_stderr": 0.015404491531393148
},
"harness|winogrande|5": {
"acc": 0.7845303867403315,
"acc_stderr": 0.011555295286059282
},
"harness|gsm8k|5": {
"acc": 0.5617892342683851,
"acc_stderr": 0.013666915917255072
}
}
```
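To work with this snapshot programmatically, the JSON file linked above can also be fetched directly from the dataset repository. A minimal sketch using `huggingface_hub`; the nesting of the file (for example, whether the per-task metrics sit under a top-level "results" key or at the file root) is an assumption, so the code only inspects the keys:
```python
import json

from huggingface_hub import hf_hub_download

# Download the results snapshot referenced above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_giraffe176__Open_Hermes_Orca_Mistral-7B",
    filename="results_2024-02-11T08-58-14.779696.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# Inspect the top-level structure first; the metrics shown above may be nested
# under a "results" key rather than at the file root (assumption).
print(list(data.keys()))
```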
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
#region-us
|
# Dataset Card for Evaluation run of giraffe176/Open_Hermes_Orca_Mistral-7B
Dataset automatically created during the evaluation run of model giraffe176/Open_Hermes_Orca_Mistral-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-11T08:58:14.779696 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of giraffe176/Open_Hermes_Orca_Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model giraffe176/Open_Hermes_Orca_Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T08:58:14.779696(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of giraffe176/Open_Hermes_Orca_Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model giraffe176/Open_Hermes_Orca_Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T08:58:14.779696(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
836a9a79e28c6e598a5c308de7f047613dcf9ec5 |
# unofficial mirror of InfoRe Technology public dataset №2
official announcement: https://www.facebook.com/groups/j2team.community/permalink/1010834009248719/
415h, 315k samples, vietnamese audiobooks of chinese wǔxiá 武俠 & xiānxiá 仙俠
the dataset was scraped from YouTube readings of wuxia & xianxia audiobooks and labelled automatically using text-alignment techniques
official download: `magnet:?xt=urn:btih:41f1290325ecb6f1230ecdff2441527c9cd43fd0&dn=audiobooks.zip&tr=http%3A%2F%2Foffice.socials.vn%3A8725%2Fannounce`
mirror: https://files.huylenguyen.com/audiobooks.zip
unzip password: `BroughtToYouByInfoRe`
pre-process: none
need to do: check misspelling
usage with HuggingFace:
```python
# pip install -q "datasets[audio]"
from datasets import load_dataset
from torch.utils.data import DataLoader
dataset = load_dataset("doof-ferb/infore2_audiobooks", split="train", streaming=True)
dataset.set_format(type="torch", columns=["audio", "transcription"])
dataloader = DataLoader(dataset, batch_size=4)
``` | doof-ferb/infore2_audiobooks | [
"task_categories:automatic-speech-recognition",
"task_categories:text-to-speech",
"size_categories:100K<n<1M",
"language:vi",
"license:cc-by-4.0",
"region:us"
] | 2024-02-11T09:11:28+00:00 | {"language": ["vi"], "license": "cc-by-4.0", "size_categories": ["100K<n<1M"], "task_categories": ["automatic-speech-recognition", "text-to-speech"], "pretty_name": "InfoRe Technology public dataset \u21162", "dataset_info": {"features": [{"name": "audio", "dtype": "audio"}, {"name": "transcription", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 55377534543.241, "num_examples": 315449}], "download_size": 46594653323, "dataset_size": 55377534543.241}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-11T18:43:29+00:00 | [] | [
"vi"
] | TAGS
#task_categories-automatic-speech-recognition #task_categories-text-to-speech #size_categories-100K<n<1M #language-Vietnamese #license-cc-by-4.0 #region-us
|
# unofficial mirror of InfoRe Technology public dataset №2
official announcement: URL
415h, 315k samples, vietnamese audiobooks of chinese wǔxiá 武俠 & xiānxiá 仙俠
bộ dữ liệu bóc ra từ YouTube đọc truyện võ hiệp & tiên hiệp, áp dụng kĩ thuật đối chiếu văn bản để dán nhãn tự động
official download: 'magnet:?xt=urn:btih:41f1290325ecb6f1230ecdff2441527c9cd43fd0&dn=URL&tr=http%3A%2F%URL%3A8725%2Fannounce'
mirror: URL
unzip password: 'BroughtToYouByInfoRe'
pre-process: none
need to do: check misspelling
usage with HuggingFace:
| [
"# unofficial mirror of InfoRe Technology public dataset №2\n\nofficial announcement: URL\n\n415h, 315k samples, vietnamese audiobooks of chinese wǔxiá 武俠 & xiānxiá 仙俠\n\nbộ dữ liệu bóc ra từ YouTube đọc truyện võ hiệp & tiên hiệp, áp dụng kĩ thuật đối chiếu văn bản để dán nhãn tự động\n\nofficial download: 'magnet:?xt=urn:btih:41f1290325ecb6f1230ecdff2441527c9cd43fd0&dn=URL&tr=http%3A%2F%URL%3A8725%2Fannounce'\n\nmirror: URL\n\nunzip password: 'BroughtToYouByInfoRe'\n\npre-process: none\n\nneed to do: check misspelling\n\nusage with HuggingFace:"
] | [
"TAGS\n#task_categories-automatic-speech-recognition #task_categories-text-to-speech #size_categories-100K<n<1M #language-Vietnamese #license-cc-by-4.0 #region-us \n",
"# unofficial mirror of InfoRe Technology public dataset №2\n\nofficial announcement: URL\n\n415h, 315k samples, vietnamese audiobooks of chinese wǔxiá 武俠 & xiānxiá 仙俠\n\nbộ dữ liệu bóc ra từ YouTube đọc truyện võ hiệp & tiên hiệp, áp dụng kĩ thuật đối chiếu văn bản để dán nhãn tự động\n\nofficial download: 'magnet:?xt=urn:btih:41f1290325ecb6f1230ecdff2441527c9cd43fd0&dn=URL&tr=http%3A%2F%URL%3A8725%2Fannounce'\n\nmirror: URL\n\nunzip password: 'BroughtToYouByInfoRe'\n\npre-process: none\n\nneed to do: check misspelling\n\nusage with HuggingFace:"
] |
4e7be1cab6bb1ba45ae67fcf03b1f1235039f522 | # Dataset Card for "IEMOCAP_Speech"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tarasabkar/IEMOCAP_Speech | [
"region:us"
] | 2024-02-11T09:29:55+00:00 | {"dataset_info": {"features": [{"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "emotion", "dtype": {"class_label": {"names": {"0": "ang", "1": "hap", "2": "neu", "3": "sad"}}}}], "splits": [{"name": "Session1", "num_bytes": 167102058.95, "num_examples": 1085}, {"name": "Session2", "num_bytes": 150799933.454, "num_examples": 1023}, {"name": "Session3", "num_bytes": 167088514.51, "num_examples": 1151}, {"name": "Session4", "num_bytes": 145505839.808, "num_examples": 1031}, {"name": "Session5", "num_bytes": 170307009.46, "num_examples": 1241}], "download_size": 788399921, "dataset_size": 800803356.182}} | 2024-02-11T09:58:01+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "IEMOCAP_Speech"
More Information needed | [
"# Dataset Card for \"IEMOCAP_Speech\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"IEMOCAP_Speech\"\n\nMore Information needed"
] |
7f9b52e6ceb1e75065d31d135dd7e08088e8b69f |
# Dataset Card for DWD Observations
<!-- Provide a quick summary of the dataset. -->
This dataset is a collection of historical German Weather Service (DWD) weather station observations at 10-minute and hourly resolutions for various parameters. The data has been
converted to Zarr for use with Xarray, and was gathered using the wonderful wetterdienst package.
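
As a rough sketch of how the converted data can be opened (the store name below is an assumption for illustration, not a documented file in this repository), Zarr stores like these are typically read lazily with Xarray:

```python
import xarray as xr

# open a local copy of one of the Zarr stores lazily; nothing is loaded into memory yet
ds = xr.open_zarr("dwd_hourly.zarr")  # hypothetical store name

print(ds)  # lists the coordinates (e.g. station id, time) and the observation variables

# example: slice a time window before pulling values into memory
# subset = ds.sel(time=slice("2020-01-01", "2020-01-31"))
```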
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | jacobbieker/dwd | [
"license:mit",
"climate",
"region:us"
] | 2024-02-11T09:35:28+00:00 | {"license": "mit", "tags": ["climate"]} | 2024-02-13T13:36:45+00:00 | [] | [] | TAGS
#license-mit #climate #region-us
|
# Dataset Card for DWD Observations
This dataset is a collection of historical German Weather Service (DWD) weather station observations at 10 minutely, and hourly resolutions for various parameters. The data has been
converted to Zarr and Xarray. The data was gathered using the wonderful wetterdienst package.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for DWD Observations\n\n\n\nThis dataset is a collection of historical German Weather Service (DWD) weather station observations at 10 minutely, and hourly resolutions for various parameters. The data has been\nconverted to Zarr and Xarray. The data was gathered using the wonderful wetterdienst package.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#license-mit #climate #region-us \n",
"# Dataset Card for DWD Observations\n\n\n\nThis dataset is a collection of historical German Weather Service (DWD) weather station observations at 10 minutely, and hourly resolutions for various parameters. The data has been\nconverted to Zarr and Xarray. The data was gathered using the wonderful wetterdienst package.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
af300b660661e2282d6057d6fd0d86f76d5569aa |
## Includes a part of OpenOrca dataset in Turkish language
This subset of the OpenOrca dataset comprises 798,350 pairs of questions and answers in Turkish,
predominantly translated from English using Google Translate.
Wherever possible, specific terminology and unique names were retained unchanged in the translation process.
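
A minimal sketch for peeking at a few pairs (the split name "train" is an assumption, and the column layout is not documented here, so whole rows are printed rather than guessing field names):

```python
from datasets import load_dataset

# stream the data so the full 798,350 rows are not downloaded up front
ds = load_dataset("ucekmez/OpenOrca-tr", split="train", streaming=True)

# print the first three rows to discover the actual field names
for i, row in enumerate(ds):
    print(row)
    if i == 2:
        break
```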
Feel free to submit pull requests to enhance the quality of the dataset.
Contact: https://www.linkedin.com/in/ugur-cekmez/ | ucekmez/OpenOrca-tr | [
"language:tr",
"license:mit",
"region:us"
] | 2024-02-11T09:47:51+00:00 | {"language": ["tr"], "license": "mit"} | 2024-02-11T14:23:41+00:00 | [] | [
"tr"
] | TAGS
#language-Turkish #license-mit #region-us
|
## Includes a part of OpenOrca dataset in Turkish language
The Subset of OpenOrca dataset in turkish language comprises 798350 pairs of questions and answers in Turkish,
predominantly translated from English using Google Translate.
Wherever possible, specific terminology and unique names were retained unchanged in the translation process.
Feel free to submit pull requests to enhance the quality of the dataset.
Contact: URL | [
"## Includes a part of OpenOrca dataset in Turkish language\n\nThe Subset of OpenOrca dataset in turkish language comprises 798350 pairs of questions and answers in Turkish, \npredominantly translated from English using Google Translate. \nWherever possible, specific terminology and unique names were retained unchanged in the translation process.\n\n\nFeel free to submit pull requests to enhance the quality of the dataset.\n\nContact: URL"
] | [
"TAGS\n#language-Turkish #license-mit #region-us \n",
"## Includes a part of OpenOrca dataset in Turkish language\n\nThe Subset of OpenOrca dataset in turkish language comprises 798350 pairs of questions and answers in Turkish, \npredominantly translated from English using Google Translate. \nWherever possible, specific terminology and unique names were retained unchanged in the translation process.\n\n\nFeel free to submit pull requests to enhance the quality of the dataset.\n\nContact: URL"
] |
b0fa6c0edc46677138ef95b2ff0f017727de0f41 |
# Dataset Card for Evaluation run of CorticalStack/travel-mistral-7B-16b-base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CorticalStack/travel-mistral-7B-16b-base](https://huggingface.co/CorticalStack/travel-mistral-7B-16b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CorticalStack__travel-mistral-7B-16b-base",
"harness_winogrande_5",
split="train")
```
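
The aggregated scores can be pulled the same way through the "results" configuration mentioned above (a sketch; in these detail dumps the "latest" split points to the newest run):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" mirrors the most recent run
results = load_dataset(
    "open-llm-leaderboard/details_CorticalStack__travel-mistral-7B-16b-base",
    "results",
    split="latest",
)
print(results[0])
```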
## Latest results
These are the [latest results from run 2024-02-11T09:58:07.096782](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__travel-mistral-7B-16b-base/blob/main/results_2024-02-11T09-58-07.096782.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.62333273914223,
"acc_stderr": 0.032635939156381126,
"acc_norm": 0.6288825536861529,
"acc_norm_stderr": 0.03329783376871095,
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502025,
"mc2": 0.5323211184642095,
"mc2_stderr": 0.015107868373889385
},
"harness|arc:challenge|25": {
"acc": 0.5750853242320819,
"acc_stderr": 0.014445698968520767,
"acc_norm": 0.6143344709897611,
"acc_norm_stderr": 0.014224250973257182
},
"harness|hellaswag|10": {
"acc": 0.6314479187412866,
"acc_stderr": 0.0048142619663768494,
"acc_norm": 0.8350926110336586,
"acc_norm_stderr": 0.0037033852685121734
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396264,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396264
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764812,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764812
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.03074630074212451,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.03074630074212451
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062146,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062146
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.03163145807552378,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.03163145807552378
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.017381415563608674,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.017381415563608674
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676173,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406943,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406943
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.014248873549217576,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.014248873549217576
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2737430167597765,
"acc_stderr": 0.014912413096372434,
"acc_norm": 0.2737430167597765,
"acc_norm_stderr": 0.014912413096372434
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632938,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632938
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537375,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537375
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44589308996088656,
"acc_stderr": 0.012695244711379776,
"acc_norm": 0.44589308996088656,
"acc_norm_stderr": 0.012695244711379776
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.019312676065786558,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.019312676065786558
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502025,
"mc2": 0.5323211184642095,
"mc2_stderr": 0.015107868373889385
},
"harness|winogrande|5": {
"acc": 0.7853196527229677,
"acc_stderr": 0.011539912734345391
},
"harness|gsm8k|5": {
"acc": 0.37680060652009095,
"acc_stderr": 0.013347858757829158
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_CorticalStack__travel-mistral-7B-16b-base | [
"region:us"
] | 2024-02-11T10:00:33+00:00 | {"pretty_name": "Evaluation run of CorticalStack/travel-mistral-7B-16b-base", "dataset_summary": "Dataset automatically created during the evaluation run of model [CorticalStack/travel-mistral-7B-16b-base](https://huggingface.co/CorticalStack/travel-mistral-7B-16b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CorticalStack__travel-mistral-7B-16b-base\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T09:58:07.096782](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__travel-mistral-7B-16b-base/blob/main/results_2024-02-11T09-58-07.096782.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.62333273914223,\n \"acc_stderr\": 0.032635939156381126,\n \"acc_norm\": 0.6288825536861529,\n \"acc_norm_stderr\": 0.03329783376871095,\n \"mc1\": 0.35862913096695226,\n \"mc1_stderr\": 0.016789289499502025,\n \"mc2\": 0.5323211184642095,\n \"mc2_stderr\": 0.015107868373889385\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520767,\n \"acc_norm\": 0.6143344709897611,\n \"acc_norm_stderr\": 0.014224250973257182\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6314479187412866,\n \"acc_stderr\": 0.0048142619663768494,\n \"acc_norm\": 0.8350926110336586,\n \"acc_norm_stderr\": 0.0037033852685121734\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396264,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396264\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n \"acc_stderr\": 0.024993053397764812,\n \"acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.024993053397764812\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.03074630074212451,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.03074630074212451\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062146,\n \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062146\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552378,\n \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552378\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7926605504587156,\n \"acc_stderr\": 0.017381415563608674,\n \"acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.017381415563608674\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676173,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676173\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406943,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406943\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8020434227330779,\n \"acc_stderr\": 0.014248873549217576,\n \"acc_norm\": 0.8020434227330779,\n \"acc_norm_stderr\": 0.014248873549217576\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n \"acc_stderr\": 0.014912413096372434,\n \"acc_norm\": 0.2737430167597765,\n \"acc_norm_stderr\": 0.014912413096372434\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632938,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632938\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537375,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537375\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n \"acc_stderr\": 0.012695244711379776,\n \"acc_norm\": 0.44589308996088656,\n \"acc_norm_stderr\": 0.012695244711379776\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6486928104575164,\n \"acc_stderr\": 0.019312676065786558,\n \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.019312676065786558\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35862913096695226,\n \"mc1_stderr\": 0.016789289499502025,\n \"mc2\": 0.5323211184642095,\n \"mc2_stderr\": 0.015107868373889385\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7853196527229677,\n \"acc_stderr\": 0.011539912734345391\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.37680060652009095,\n \"acc_stderr\": 
0.013347858757829158\n }\n}\n```", "repo_url": "https://huggingface.co/CorticalStack/travel-mistral-7B-16b-base", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|arc:challenge|25_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|gsm8k|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hellaswag|10_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T09-58-07.096782.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T09-58-07.096782.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T09-58-07.096782.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T09-58-07.096782.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T09-58-07.096782.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T09_58_07.096782", "path": ["**/details_harness|winogrande|5_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T09-58-07.096782.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_11T09_58_07.096782", "path": ["results_2024-02-11T09-58-07.096782.parquet"]}, {"split": "latest", "path": ["results_2024-02-11T09-58-07.096782.parquet"]}]}]} | 2024-02-11T10:00:55+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of CorticalStack/travel-mistral-7B-16b-base
Dataset automatically created during the evaluation run of model CorticalStack/travel-mistral-7B-16b-base on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
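The loading snippet itself was not preserved in this processed copy; below is a minimal sketch, assuming the details repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming convention for this model:

```python
from datasets import load_dataset

# Assumed repository name, following the leaderboard's details naming convention
data = load_dataset("open-llm-leaderboard/details_CorticalStack__travel-mistral-7B-16b-base",
	"harness_winogrande_5",
	split="train")
```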
## Latest results
These are the latest results from run 2024-02-11T09:58:07.096782 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of CorticalStack/travel-mistral-7B-16b-base\n\n\n\nDataset automatically created during the evaluation run of model CorticalStack/travel-mistral-7B-16b-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T09:58:07.096782(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of CorticalStack/travel-mistral-7B-16b-base\n\n\n\nDataset automatically created during the evaluation run of model CorticalStack/travel-mistral-7B-16b-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T09:58:07.096782(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0ea0df0c9830bdfd18857649645df235cfc560b3 |
# Dataset Card for Evaluation run of FreedomIntelligence/AceGPT-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FreedomIntelligence/AceGPT-7B](https://huggingface.co/FreedomIntelligence/AceGPT-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FreedomIntelligence__AceGPT-7B",
"harness_winogrande_5",
split="train")
```
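If you only need the aggregated metrics mentioned above, you can also load the "results" configuration directly; a minimal sketch using the "latest" split defined for this repository:

```python
from datasets import load_dataset

# Aggregated metrics of the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_FreedomIntelligence__AceGPT-7B",
	"results",
	split="latest")
```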
## Latest results
These are the [latest results from run 2024-02-11T10:08:53.016529](https://huggingface.co/datasets/open-llm-leaderboard/details_FreedomIntelligence__AceGPT-7B/blob/main/results_2024-02-11T10-08-53.016529.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4331306075709672,
"acc_stderr": 0.03424883157582962,
"acc_norm": 0.4376724571110185,
"acc_norm_stderr": 0.03503717854163451,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015018,
"mc2": 0.3874927202329178,
"mc2_stderr": 0.013639417434393192
},
"harness|arc:challenge|25": {
"acc": 0.49573378839590443,
"acc_stderr": 0.014610858923956952,
"acc_norm": 0.5358361774744027,
"acc_norm_stderr": 0.01457381366473572
},
"harness|hellaswag|10": {
"acc": 0.5746863174666401,
"acc_stderr": 0.004933800927560531,
"acc_norm": 0.7754431388169687,
"acc_norm_stderr": 0.0041643733628592815
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3815789473684211,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.3815789473684211,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44150943396226416,
"acc_stderr": 0.030561590426731837,
"acc_norm": 0.44150943396226416,
"acc_norm_stderr": 0.030561590426731837
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.04122728707651282,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.04122728707651282
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.36416184971098264,
"acc_stderr": 0.03669072477416908,
"acc_norm": 0.36416184971098264,
"acc_norm_stderr": 0.03669072477416908
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.037082846624165444,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.037082846624165444
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33617021276595743,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.33617021276595743,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502706986,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502706986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.023266512213730585,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.023266512213730585
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.45483870967741935,
"acc_stderr": 0.028327743091561074,
"acc_norm": 0.45483870967741935,
"acc_norm_stderr": 0.028327743091561074
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.032550867699701024,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.032550867699701024
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.03903698647748441,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.03903698647748441
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4696969696969697,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.4696969696969697,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6424870466321243,
"acc_stderr": 0.034588160421810114,
"acc_norm": 0.6424870466321243,
"acc_norm_stderr": 0.034588160421810114
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4076923076923077,
"acc_stderr": 0.024915243985987847,
"acc_norm": 0.4076923076923077,
"acc_norm_stderr": 0.024915243985987847
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5743119266055046,
"acc_stderr": 0.0211992359724708,
"acc_norm": 0.5743119266055046,
"acc_norm_stderr": 0.0211992359724708
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.028765111718046948,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.028765111718046948
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03503235296367992,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03503235296367992
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5780590717299579,
"acc_stderr": 0.032148146302403695,
"acc_norm": 0.5780590717299579,
"acc_norm_stderr": 0.032148146302403695
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.484304932735426,
"acc_stderr": 0.0335412657542081,
"acc_norm": 0.484304932735426,
"acc_norm_stderr": 0.0335412657542081
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.45038167938931295,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.45038167938931295,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068382,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068382
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.04792898170907062,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.04792898170907062
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4785276073619632,
"acc_stderr": 0.0392474687675113,
"acc_norm": 0.4785276073619632,
"acc_norm_stderr": 0.0392474687675113
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5048543689320388,
"acc_stderr": 0.049505043821289195,
"acc_norm": 0.5048543689320388,
"acc_norm_stderr": 0.049505043821289195
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.031937057262002924,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.031937057262002924
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6040868454661558,
"acc_stderr": 0.017488247006979266,
"acc_norm": 0.6040868454661558,
"acc_norm_stderr": 0.017488247006979266
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.026788811931562757,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.026788811931562757
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2659217877094972,
"acc_stderr": 0.014776765066438885,
"acc_norm": 0.2659217877094972,
"acc_norm_stderr": 0.014776765066438885
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4542483660130719,
"acc_stderr": 0.02850980780262656,
"acc_norm": 0.4542483660130719,
"acc_norm_stderr": 0.02850980780262656
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4983922829581994,
"acc_stderr": 0.02839794490780661,
"acc_norm": 0.4983922829581994,
"acc_norm_stderr": 0.02839794490780661
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.02774431344337654,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.02774431344337654
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3546099290780142,
"acc_stderr": 0.028538650028878638,
"acc_norm": 0.3546099290780142,
"acc_norm_stderr": 0.028538650028878638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.31747066492829207,
"acc_stderr": 0.01188889206880931,
"acc_norm": 0.31747066492829207,
"acc_norm_stderr": 0.01188889206880931
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41544117647058826,
"acc_stderr": 0.029935342707877753,
"acc_norm": 0.41544117647058826,
"acc_norm_stderr": 0.029935342707877753
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42810457516339867,
"acc_stderr": 0.0200176292142131,
"acc_norm": 0.42810457516339867,
"acc_norm_stderr": 0.0200176292142131
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5102040816326531,
"acc_stderr": 0.03200255347893782,
"acc_norm": 0.5102040816326531,
"acc_norm_stderr": 0.03200255347893782
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6218905472636815,
"acc_stderr": 0.034288678487786564,
"acc_norm": 0.6218905472636815,
"acc_norm_stderr": 0.034288678487786564
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03615507630310936,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03615507630310936
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015018,
"mc2": 0.3874927202329178,
"mc2_stderr": 0.013639417434393192
},
"harness|winogrande|5": {
"acc": 0.7277032359905288,
"acc_stderr": 0.012510697991453937
},
"harness|gsm8k|5": {
"acc": 0.11144806671721001,
"acc_stderr": 0.008668021353794433
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_FreedomIntelligence__AceGPT-7B | [
"region:us"
] | 2024-02-11T10:10:43+00:00 | {"pretty_name": "Evaluation run of FreedomIntelligence/AceGPT-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [FreedomIntelligence/AceGPT-7B](https://huggingface.co/FreedomIntelligence/AceGPT-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FreedomIntelligence__AceGPT-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T10:08:53.016529](https://huggingface.co/datasets/open-llm-leaderboard/details_FreedomIntelligence__AceGPT-7B/blob/main/results_2024-02-11T10-08-53.016529.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4331306075709672,\n \"acc_stderr\": 0.03424883157582962,\n \"acc_norm\": 0.4376724571110185,\n \"acc_norm_stderr\": 0.03503717854163451,\n \"mc1\": 0.26193390452876375,\n \"mc1_stderr\": 0.015392118805015018,\n \"mc2\": 0.3874927202329178,\n \"mc2_stderr\": 0.013639417434393192\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.49573378839590443,\n \"acc_stderr\": 0.014610858923956952,\n \"acc_norm\": 0.5358361774744027,\n \"acc_norm_stderr\": 0.01457381366473572\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5746863174666401,\n \"acc_stderr\": 0.004933800927560531,\n \"acc_norm\": 0.7754431388169687,\n \"acc_norm_stderr\": 0.0041643733628592815\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3815789473684211,\n \"acc_stderr\": 0.03953173377749194,\n \"acc_norm\": 0.3815789473684211,\n \"acc_norm_stderr\": 0.03953173377749194\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.44150943396226416,\n \"acc_stderr\": 0.030561590426731837,\n \"acc_norm\": 0.44150943396226416,\n \"acc_norm_stderr\": 0.030561590426731837\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.04122728707651282\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.25,\n 
\"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.36416184971098264,\n \"acc_stderr\": 0.03669072477416908,\n \"acc_norm\": 0.36416184971098264,\n \"acc_norm_stderr\": 0.03669072477416908\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.037082846624165444,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.037082846624165444\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.33617021276595743,\n \"acc_stderr\": 0.030881618520676942,\n \"acc_norm\": 0.33617021276595743,\n \"acc_norm_stderr\": 0.030881618520676942\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n \"acc_stderr\": 0.044895393502706986,\n \"acc_norm\": 0.3508771929824561,\n \"acc_norm_stderr\": 0.044895393502706986\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.023266512213730585,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.023266512213730585\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.45483870967741935,\n \"acc_stderr\": 0.028327743091561074,\n \"acc_norm\": 0.45483870967741935,\n \"acc_norm_stderr\": 0.028327743091561074\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.032550867699701024,\n \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.032550867699701024\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.509090909090909,\n \"acc_stderr\": 0.03903698647748441,\n \"acc_norm\": 0.509090909090909,\n \"acc_norm_stderr\": 0.03903698647748441\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4696969696969697,\n \"acc_stderr\": 0.03555804051763929,\n \"acc_norm\": 0.4696969696969697,\n \"acc_norm_stderr\": 0.03555804051763929\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6424870466321243,\n \"acc_stderr\": 0.034588160421810114,\n \"acc_norm\": 0.6424870466321243,\n \"acc_norm_stderr\": 0.034588160421810114\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4076923076923077,\n \"acc_stderr\": 0.024915243985987847,\n \"acc_norm\": 0.4076923076923077,\n \"acc_norm_stderr\": 0.024915243985987847\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.030956636328566545,\n \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.030956636328566545\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5743119266055046,\n \"acc_stderr\": 0.0211992359724708,\n \"acc_norm\": 0.5743119266055046,\n \"acc_norm_stderr\": 0.0211992359724708\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.028765111718046948,\n \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.028765111718046948\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03503235296367992,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03503235296367992\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5780590717299579,\n \"acc_stderr\": 0.032148146302403695,\n \"acc_norm\": 0.5780590717299579,\n \"acc_norm_stderr\": 0.032148146302403695\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.484304932735426,\n \"acc_stderr\": 0.0335412657542081,\n \"acc_norm\": 0.484304932735426,\n \"acc_norm_stderr\": 0.0335412657542081\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.45038167938931295,\n \"acc_stderr\": 0.04363643698524779,\n \"acc_norm\": 0.45038167938931295,\n \"acc_norm_stderr\": 0.04363643698524779\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068382,\n \"acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068382\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.04792898170907062,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.04792898170907062\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4785276073619632,\n \"acc_stderr\": 0.0392474687675113,\n \"acc_norm\": 0.4785276073619632,\n \"acc_norm_stderr\": 0.0392474687675113\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5048543689320388,\n \"acc_stderr\": 0.049505043821289195,\n \"acc_norm\": 0.5048543689320388,\n \"acc_norm_stderr\": 0.049505043821289195\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.031937057262002924,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.031937057262002924\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.6040868454661558,\n \"acc_stderr\": 0.017488247006979266,\n \"acc_norm\": 0.6040868454661558,\n \"acc_norm_stderr\": 0.017488247006979266\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4508670520231214,\n \"acc_stderr\": 0.026788811931562757,\n \"acc_norm\": 0.4508670520231214,\n \"acc_norm_stderr\": 0.026788811931562757\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2659217877094972,\n \"acc_stderr\": 0.014776765066438885,\n \"acc_norm\": 0.2659217877094972,\n \"acc_norm_stderr\": 0.014776765066438885\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4542483660130719,\n \"acc_stderr\": 0.02850980780262656,\n \"acc_norm\": 0.4542483660130719,\n \"acc_norm_stderr\": 0.02850980780262656\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4983922829581994,\n \"acc_stderr\": 0.02839794490780661,\n \"acc_norm\": 0.4983922829581994,\n \"acc_norm_stderr\": 0.02839794490780661\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.02774431344337654,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.02774431344337654\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3546099290780142,\n \"acc_stderr\": 0.028538650028878638,\n \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.028538650028878638\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.31747066492829207,\n \"acc_stderr\": 0.01188889206880931,\n \"acc_norm\": 0.31747066492829207,\n \"acc_norm_stderr\": 0.01188889206880931\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.029935342707877753,\n \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.029935342707877753\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.42810457516339867,\n \"acc_stderr\": 0.0200176292142131,\n \"acc_norm\": 0.42810457516339867,\n \"acc_norm_stderr\": 0.0200176292142131\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.509090909090909,\n \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5102040816326531,\n \"acc_stderr\": 0.03200255347893782,\n \"acc_norm\": 0.5102040816326531,\n \"acc_norm_stderr\": 0.03200255347893782\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n \"acc_stderr\": 0.034288678487786564,\n \"acc_norm\": 0.6218905472636815,\n \"acc_norm_stderr\": 0.034288678487786564\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03615507630310936,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03615507630310936\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n \"mc1_stderr\": 0.015392118805015018,\n \"mc2\": 0.3874927202329178,\n \"mc2_stderr\": 0.013639417434393192\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7277032359905288,\n \"acc_stderr\": 0.012510697991453937\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11144806671721001,\n \"acc_stderr\": 
0.008668021353794433\n }\n}\n```", "repo_url": "https://huggingface.co/FreedomIntelligence/AceGPT-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|arc:challenge|25_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|gsm8k|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hellaswag|10_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T10-08-53.016529.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T10-08-53.016529.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T10-08-53.016529.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T10-08-53.016529.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T10-08-53.016529.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T10_08_53.016529", "path": ["**/details_harness|winogrande|5_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T10-08-53.016529.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_11T10_08_53.016529", "path": ["results_2024-02-11T10-08-53.016529.parquet"]}, {"split": "latest", "path": ["results_2024-02-11T10-08-53.016529.parquet"]}]}]} | 2024-02-11T10:11:03+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of FreedomIntelligence/AceGPT-7B
Dataset automatically created during the evaluation run of model FreedomIntelligence/AceGPT-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
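For example, to pull the per-sample details for one task (here the `harness_winogrande_5` configuration, one of the 63 listed above):

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_FreedomIntelligence__AceGPT-7B",
                    "harness_winogrande_5",
                    split="train")
```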
## Latest results
These are the latest results from run 2024-02-11T10:08:53.016529 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks). You can find each in the results and the "latest" split for each eval:
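To work with the aggregated numbers rather than per-sample details, the "results" configuration can be loaded the same way; per the config list above, its "latest" split points at this run:

```python
from datasets import load_dataset

results = load_dataset("open-llm-leaderboard/details_FreedomIntelligence__AceGPT-7B",
                       "results",
                       split="latest")
print(results[0])  # aggregated metrics for the 2024-02-11T10:08:53 run
```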
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of FreedomIntelligence/AceGPT-7B\n\n\n\nDataset automatically created during the evaluation run of model FreedomIntelligence/AceGPT-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T10:08:53.016529(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of FreedomIntelligence/AceGPT-7B\n\n\n\nDataset automatically created during the evaluation run of model FreedomIntelligence/AceGPT-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T10:08:53.016529(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
e028a58c190f8655616689c791de1ad64c8d4b59 |
## Introduction
This is a datamix created for the [The Learning Agency Lab - PII Data Detection](https://www.kaggle.com/competitions/pii-detection-removal-from-educational-data/overview) Kaggle competition. It supports training / evaluation of models that can identify a set of PII types in texts. Specifically the PII types are:
- `NAME_STUDENT` - The full or partial name of a student that is not necessarily the author of the essay. This excludes instructors, authors, and other person names.
- `EMAIL` - A student’s email address.
- `USERNAME` - A student's username on any platform.
- `ID_NUM` - A number or sequence of characters that could be used to identify a student, such as a student ID or a social security number.
- `PHONE_NUM` - A phone number associated with a student.
- `URL_PERSONAL` - A URL that might be used to identify a student.
- `STREET_ADDRESS` - A full or partial street address that is associated with the student, such as their home address.
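A minimal sketch of loading the mix for token-classification work (the repo id comes from this card, but the split and column names below are assumptions based on the competition's token/label format, so confirm them against the loaded dataset):

```python
from datasets import load_dataset

ds = load_dataset("rbiswasfc/pii-datamix")
print(ds)  # shows the actual splits and features

# Assumed competition-style schema: a "tokens" list per essay and a BIO-tagged
# "labels" list of the same length -- adjust if the real column names differ.
sample = ds["train"][0]
for token, label in zip(sample["tokens"], sample["labels"]):
    if label != "O":  # print only tokens tagged with one of the PII types above
        print(token, label)
```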
## Sources
The datamix comprises the following contributions:
- [Dataset](https://www.kaggle.com/datasets/nbroad/pii-dd-mistral-generated) by Nicholas: [More data - 2355 essays generated by the best open source model - Mixtral 8x7b ⚡](https://www.kaggle.com/competitions/pii-detection-removal-from-educational-data/discussion/472221)
- [Dataset](https://www.kaggle.com/datasets/pjmathematician/pii-detection-dataset-gpt) by PJMathematician: [2000 AI Created PII detection External dataset](https://www.kaggle.com/competitions/pii-detection-removal-from-educational-data/discussion/470921)
- [Dataset](https://www.kaggle.com/datasets/alejopaullier/pii-external-dataset/data) by Moth: [New Dataset: +4400 external generated texts 🚀🚀🚀](https://www.kaggle.com/competitions/pii-detection-removal-from-educational-data/discussion/469493)
- [Dataset](https://www.kaggle.com/datasets/valentinwerner/pii-label-specific-data) by Valentin: [4367 new essays - generated to promote diversity](https://www.kaggle.com/competitions/pii-detection-removal-from-educational-data/discussion/475297) | rbiswasfc/pii-datamix | [
"task_categories:token-classification",
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-02-11T10:34:39+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["token-classification"]} | 2024-02-11T10:35:34+00:00 | [] | [
"en"
] | TAGS
#task_categories-token-classification #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us
|
## Introduction
This is a datamix created for the The Learning Agency Lab - PII Data Detection Kaggle competition. It supports training / evaluation of models that can identify a set of PII types in texts. Specifically the PII types are:
- 'NAME_STUDENT' - The full or partial name of a student that is not necessarily the author of the essay. This excludes instructors, authors, and other person names.
- 'EMAIL' - A student’s email address.
- 'USERNAME' - A student's username on any platform.
- 'ID_NUM' - A number or sequence of characters that could be used to identify a student, such as a student ID or a social security number.
- 'PHONE_NUM' - A phone number associated with a student.
- 'URL_PERSONAL' - A URL that might be used to identify a student.
- 'STREET_ADDRESS' - A full or partial street address that is associated with the student, such as their home address.
## Sources
The datamix comprises the following contributions:
- Dataset by Nicholas: More data - 2355 essays generated by the best open source model - Mixtral 8x7b
- Dataset by PJMathematician: 2000 AI Created PII detection External dataset
- Dataset by Moth: New Dataset: +4400 external generated texts
- Dataset by Valentin: 4367 new essays - generated to promote diversity
| [
"## Introduction\nThis is a datamix created for the The Learning Agency Lab - PII Data Detection Kaggle competition. It supports training / evaluation of models that can identify a set of PII types in texts. Specifically the PII types are:\n\n- 'NAME_STUDENT' - The full or partial name of a student that is not necessarily the author of the essay. - This excludes instructors, authors, and other person names.\n- 'EMAIL' - A student’s email address.\n- 'USERNAME' - A student's username on any platform.\n- 'ID_NUM' - A number or sequence of characters that could be used to identify a student, such as a student ID or a social security number.\n- 'PHONE_NUM' - A phone number associated with a student.\n- 'URL_PERSONAL' - A URL that might be used to identify a student.\n- 'STREET_ADDRESS' - A full or partial street address that is associated with the student, such as their home address.",
"## Sources\nThe datamix comprises of the following contributions:\n\n - Dataset by Nicholas: More data - 2355 essays generated by the best open source model - Mixtral 8x7b \n - Dataset by PJMathematician: 2000 AI Created PII detection External dataset\n - Dataset by Moth: New Dataset: +4400 external generated texts \n - Dataset by Valentin: 4367 new essays - generated to promote diversity"
] | [
"TAGS\n#task_categories-token-classification #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us \n",
"## Introduction\nThis is a datamix created for the The Learning Agency Lab - PII Data Detection Kaggle competition. It supports training / evaluation of models that can identify a set of PII types in texts. Specifically the PII types are:\n\n- 'NAME_STUDENT' - The full or partial name of a student that is not necessarily the author of the essay. - This excludes instructors, authors, and other person names.\n- 'EMAIL' - A student’s email address.\n- 'USERNAME' - A student's username on any platform.\n- 'ID_NUM' - A number or sequence of characters that could be used to identify a student, such as a student ID or a social security number.\n- 'PHONE_NUM' - A phone number associated with a student.\n- 'URL_PERSONAL' - A URL that might be used to identify a student.\n- 'STREET_ADDRESS' - A full or partial street address that is associated with the student, such as their home address.",
"## Sources\nThe datamix comprises of the following contributions:\n\n - Dataset by Nicholas: More data - 2355 essays generated by the best open source model - Mixtral 8x7b \n - Dataset by PJMathematician: 2000 AI Created PII detection External dataset\n - Dataset by Moth: New Dataset: +4400 external generated texts \n - Dataset by Valentin: 4367 new essays - generated to promote diversity"
] |
5d86124cc665da831803785ae4fab880feb2a07c |
<div align="center">
<img width="640" alt="jigarsiddhpura/IPD" src="https://huggingface.co/datasets/jigarsiddhpura/IPD/resolve/main/thumbnail.jpg">
</div>
### Dataset Labels
```
['dry-person', 'object', 'wet-swimmer']
```
### Number of Images
```json
{'test': 77, 'valid': 153, 'train': 1608}
```
### How to Use
- Install [datasets](https://pypi.org/project/datasets/):
```bash
pip install datasets
```
- Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("jigarsiddhpura/IPD", name="full")
example = ds['train'][0]
```
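- Inspect an example (a sketch: the `objects`/`bbox`/`category` layout assumed below follows typical roboflow2huggingface exports; check `ds["train"].features` if it differs):

```python
labels = ["dry-person", "object", "wet-swimmer"]  # from "Dataset Labels" above

example = ds["train"][0]
print(example.keys())

# Assumed layout: objects["bbox"] holds COCO [x, y, width, height] boxes and
# objects["category"] holds integer ids into the label list.
for bbox, category in zip(example["objects"]["bbox"], example["objects"]["category"]):
    print(labels[category], bbox)
```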
### Roboflow Dataset Page
[https://universe.roboflow.com/resq/tiny-people-detection-rpi/dataset/1](https://universe.roboflow.com/resq/tiny-people-detection-rpi/dataset/1?ref=roboflow2huggingface)
### Citation
```
@misc{ tiny-people-detection-rpi_dataset,
title = { Tiny people detection RPI Dataset },
type = { Open Source Dataset },
author = { ResQ },
howpublished = { \\url{ https://universe.roboflow.com/resq/tiny-people-detection-rpi } },
url = { https://universe.roboflow.com/resq/tiny-people-detection-rpi },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2023 },
month = { sep },
note = { visited on 2024-02-11 },
}
```
### License
CC BY 4.0
### Dataset Summary
This dataset was exported via roboflow.com on February 10, 2024 at 7:28 AM GMT
Roboflow is an end-to-end computer vision platform that helps you
* collaborate with your team on computer vision projects
* collect & organize images
* understand and search unstructured image data
* annotate, and create datasets
* export, train, and deploy computer vision models
* use active learning to improve your dataset over time
For state of the art Computer Vision training notebooks you can use with this dataset,
visit https://github.com/roboflow/notebooks
To find over 100k other datasets and pre-trained models, visit https://universe.roboflow.com
The dataset includes 1838 images.
People are annotated in COCO format.
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
* Resize to 640x640 (Stretch)
The following augmentation was applied to create 3 versions of each source image:
* Randomly crop between 0 and 67 percent of the image
* Salt and pepper noise was applied to 4 percent of pixels
The following transformations were applied to the bounding boxes of each image:
* Random shear of between -5° to +5° horizontally and -5° to +5° vertically
| jigarsiddhpura/IPD | [
"task_categories:object-detection",
"roboflow",
"roboflow2huggingface",
"region:us"
] | 2024-02-11T10:54:24+00:00 | {"task_categories": ["object-detection"], "tags": ["roboflow", "roboflow2huggingface"]} | 2024-02-12T00:38:27+00:00 | [] | [] | TAGS
#task_categories-object-detection #roboflow #roboflow2huggingface #region-us
|
<div align="center">
<img width="640" alt="jigarsiddhpura/IPD" src="URL
</div>
### Dataset Labels
### Number of Images
### How to Use
- Install datasets:
- Load the dataset:
### Roboflow Dataset Page
URL
### License
CC BY 4.0
### Dataset Summary
This dataset was exported via URL on February 10, 2024 at 7:28 AM GMT
Roboflow is an end-to-end computer vision platform that helps you
* collaborate with your team on computer vision projects
* collect & organize images
* understand and search unstructured image data
* annotate, and create datasets
* export, train, and deploy computer vision models
* use active learning to improve your dataset over time
For state of the art Computer Vision training notebooks you can use with this dataset,
visit URL
To find over 100k other datasets and pre-trained models, visit URL
The dataset includes 1838 images.
People are annotated in COCO format.
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
* Resize to 640x640 (Stretch)
The following augmentation was applied to create 3 versions of each source image:
* Randomly crop between 0 and 67 percent of the image
* Salt and pepper noise was applied to 4 percent of pixels
The following transformations were applied to the bounding boxes of each image:
* Random shear of between -5° to +5° horizontally and -5° to +5° vertically
| [
"### Dataset Labels",
"### Number of Images",
"### How to Use\n\n- Install datasets:\n\n\n\n- Load the dataset:",
"### Roboflow Dataset Page\nURL",
"### License\nCC BY 4.0",
"### Dataset Summary\nThis dataset was exported via URL on February 10, 2024 at 7:28 AM GMT\n\nRoboflow is an end-to-end computer vision platform that helps you\n* collaborate with your team on computer vision projects\n* collect & organize images\n* understand and search unstructured image data\n* annotate, and create datasets\n* export, train, and deploy computer vision models\n* use active learning to improve your dataset over time\n\nFor state of the art Computer Vision training notebooks you can use with this dataset,\nvisit URL\n\nTo find over 100k other datasets and pre-trained models, visit URL\n\nThe dataset includes 1838 images.\nPeople are annotated in COCO format.\n\nThe following pre-processing was applied to each image:\n* Auto-orientation of pixel data (with EXIF-orientation stripping)\n* Resize to 640x640 (Stretch)\n\nThe following augmentation was applied to create 3 versions of each source image:\n* Randomly crop between 0 and 67 percent of the image\n* Salt and pepper noise was applied to 4 percent of pixels\n\nThe following transformations were applied to the bounding boxes of each image:\n* Random shear of between -5° to +5° horizontally and -5° to +5° vertically"
] | [
"TAGS\n#task_categories-object-detection #roboflow #roboflow2huggingface #region-us \n",
"### Dataset Labels",
"### Number of Images",
"### How to Use\n\n- Install datasets:\n\n\n\n- Load the dataset:",
"### Roboflow Dataset Page\nURL",
"### License\nCC BY 4.0",
"### Dataset Summary\nThis dataset was exported via URL on February 10, 2024 at 7:28 AM GMT\n\nRoboflow is an end-to-end computer vision platform that helps you\n* collaborate with your team on computer vision projects\n* collect & organize images\n* understand and search unstructured image data\n* annotate, and create datasets\n* export, train, and deploy computer vision models\n* use active learning to improve your dataset over time\n\nFor state of the art Computer Vision training notebooks you can use with this dataset,\nvisit URL\n\nTo find over 100k other datasets and pre-trained models, visit URL\n\nThe dataset includes 1838 images.\nPeople are annotated in COCO format.\n\nThe following pre-processing was applied to each image:\n* Auto-orientation of pixel data (with EXIF-orientation stripping)\n* Resize to 640x640 (Stretch)\n\nThe following augmentation was applied to create 3 versions of each source image:\n* Randomly crop between 0 and 67 percent of the image\n* Salt and pepper noise was applied to 4 percent of pixels\n\nThe following transformations were applied to the bounding boxes of each image:\n* Random shear of between -5° to +5° horizontally and -5° to +5° vertically"
] |
f6aba09277e43917208b04814e25b8c5f627be3b |
# 🎨 DALL•E 3 Images Dataset
This is a dataset with images made by DALL•E 3.
## Dataset parameters
1. **Count of images**: 3310
2. **Zip file with dataset**: True
3. **Captions with images**: False
## License
License for this dataset: [MIT](https://www.mit.edu/~amini/LICENSE.md)
## Use in *datasets*
1. ```bash
pip install -q datasets
```
2. ```py
from datasets import load_dataset
dataset = load_dataset(
"ehristoforu/dalle-3-images",
revision="main"
)
```
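Once loaded, individual images can be inspected directly; a small sketch, assuming the default `train` split and an `image` column (check `dataset.column_names` if they differ):

```py
# Grab the first record and open its image; the split and column names are
# assumptions, not documented by this card.
first = dataset["train"][0]
first["image"].show()  # PIL.Image, opened in the default viewer
```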
#### *Enjoy with this dataset!* | ehristoforu/dalle-3-images | [
"task_categories:text-to-image",
"task_categories:image-to-image",
"size_categories:1K<n<10K",
"license:mit",
"dalle-3",
"dall-e",
"dalle-images",
"images",
"croissant",
"region:us"
] | 2024-02-11T11:30:27+00:00 | {"license": "mit", "size_categories": ["1K<n<10K"], "task_categories": ["text-to-image", "image-to-image"], "tags": ["dalle-3", "dall-e", "dalle-images", "images", "croissant"]} | 2024-02-11T13:57:44+00:00 | [] | [] | TAGS
#task_categories-text-to-image #task_categories-image-to-image #size_categories-1K<n<10K #license-mit #dalle-3 #dall-e #dalle-images #images #croissant #region-us
|
# DALL•E 3 Images Dataset
This is a dataset with images made by DALL•E 3.
## Dataset parameters
1. Count of images: 3310
2. Zip file with dataset: True
3. Captions with images: False
## License
License for this dataset: MIT
## Use in *datasets*
1.
2.
#### *Enjoy with this dataset!* | [
"# DALL•E 3 Images Dataset\n\nThis is datase with images made by Dalle3.",
"## Dataset parameters\n1. Count of images: 3310\n2. Zip file with dataset: True\n3. Captions with images: False",
"## License\n\nLicense for this dataset: MIT",
"## Use in *datasets*\n\n1. \n2.",
"#### *Enjoy with this dataset!*"
] | [
"TAGS\n#task_categories-text-to-image #task_categories-image-to-image #size_categories-1K<n<10K #license-mit #dalle-3 #dall-e #dalle-images #images #croissant #region-us \n",
"# DALL•E 3 Images Dataset\n\nThis is datase with images made by Dalle3.",
"## Dataset parameters\n1. Count of images: 3310\n2. Zip file with dataset: True\n3. Captions with images: False",
"## License\n\nLicense for this dataset: MIT",
"## Use in *datasets*\n\n1. \n2.",
"#### *Enjoy with this dataset!*"
] |
23e4c3139fb063a3ca0b2eae42f7430607f2af39 |
# Dataset Card for Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO](https://huggingface.co/Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT-DPO",
"harness_winogrande_5",
split="train")
```
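The aggregated numbers mentioned above live in the `"results"` configuration; a minimal sketch of reading them back (the exact column layout is an assumption here and mirrors the JSON shown under "Latest results" below):

```python
from datasets import load_dataset

# The "results" config aggregates every run; its "train" split tracks the latest one.
results = load_dataset(
    "open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT-DPO",
    "results",
    split="train",
)
print(results.column_names)  # inspect what is available
print(results[0])            # first (and, with a single run, latest) aggregated record
```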
## Latest results
These are the [latest results from run 2024-02-11T12:02:04.707768](https://huggingface.co/datasets/open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT-DPO/blob/main/results_2024-02-11T12-02-04.707768.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6083454936984162,
"acc_stderr": 0.033140017189034275,
"acc_norm": 0.6127945476017843,
"acc_norm_stderr": 0.0338104933555728,
"mc1": 0.40514075887392903,
"mc1_stderr": 0.01718561172775337,
"mc2": 0.5791139392635098,
"mc2_stderr": 0.015266138543062658
},
"harness|arc:challenge|25": {
"acc": 0.5665529010238908,
"acc_stderr": 0.014481376224558903,
"acc_norm": 0.6049488054607508,
"acc_norm_stderr": 0.014285898292938163
},
"harness|hellaswag|10": {
"acc": 0.6436964748058156,
"acc_stderr": 0.004779276329704048,
"acc_norm": 0.8383788090021908,
"acc_norm_stderr": 0.0036735065123709547
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.04043461861916747,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.04043461861916747
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.0250107491161376,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.0250107491161376
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949097,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949097
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.025906087021319295,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.025906087021319295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.0356796977226805,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.0356796977226805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.031156269519646826,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.031156269519646826
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5717948717948718,
"acc_stderr": 0.025088301454694827,
"acc_norm": 0.5717948717948718,
"acc_norm_stderr": 0.025088301454694827
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059288,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.017208579357787586,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.017208579357787586
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.036959801280988226,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.036959801280988226
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.04453197507374984,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.04453197507374984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.03512385283705048,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.03512385283705048
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.045416094465039504,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.045416094465039504
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.02308663508684141,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.02308663508684141
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371155,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371155
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3653631284916201,
"acc_stderr": 0.01610483388014229,
"acc_norm": 0.3653631284916201,
"acc_norm_stderr": 0.01610483388014229
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.02633661346904663,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.02633661346904663
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.025842248700902168,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.025842248700902168
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44589308996088656,
"acc_stderr": 0.012695244711379772,
"acc_norm": 0.44589308996088656,
"acc_norm_stderr": 0.012695244711379772
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5955882352941176,
"acc_stderr": 0.029812630701569743,
"acc_norm": 0.5955882352941176,
"acc_norm_stderr": 0.029812630701569743
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.019722058939618068,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.019722058939618068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233257,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233257
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727668,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727668
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40514075887392903,
"mc1_stderr": 0.01718561172775337,
"mc2": 0.5791139392635098,
"mc2_stderr": 0.015266138543062658
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836676
},
"harness|gsm8k|5": {
"acc": 0.4177407126611069,
"acc_stderr": 0.013584820638504832
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT-DPO | [
"region:us"
] | 2024-02-11T12:04:27+00:00 | {"pretty_name": "Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO](https://huggingface.co/Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T12:02:04.707768](https://huggingface.co/datasets/open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT-DPO/blob/main/results_2024-02-11T12-02-04.707768.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6083454936984162,\n \"acc_stderr\": 0.033140017189034275,\n \"acc_norm\": 0.6127945476017843,\n \"acc_norm_stderr\": 0.0338104933555728,\n \"mc1\": 0.40514075887392903,\n \"mc1_stderr\": 0.01718561172775337,\n \"mc2\": 0.5791139392635098,\n \"mc2_stderr\": 0.015266138543062658\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5665529010238908,\n \"acc_stderr\": 0.014481376224558903,\n \"acc_norm\": 0.6049488054607508,\n \"acc_norm_stderr\": 0.014285898292938163\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6436964748058156,\n \"acc_stderr\": 0.004779276329704048,\n \"acc_norm\": 0.8383788090021908,\n \"acc_norm_stderr\": 0.0036735065123709547\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.04043461861916747,\n \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.04043461861916747\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.0250107491161376,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.0250107491161376\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7064516129032258,\n \"acc_stderr\": 0.025906087021319295,\n \"acc_norm\": 0.7064516129032258,\n \"acc_norm_stderr\": 0.025906087021319295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.0356796977226805,\n \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.0356796977226805\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7424242424242424,\n \"acc_stderr\": 0.031156269519646826,\n \"acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.031156269519646826\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n 
\"acc_norm_stderr\": 0.02463978909770944\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5717948717948718,\n \"acc_stderr\": 0.025088301454694827,\n \"acc_norm\": 0.5717948717948718,\n \"acc_norm_stderr\": 0.025088301454694827\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059288,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059288\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7981651376146789,\n \"acc_stderr\": 0.017208579357787586,\n \"acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.017208579357787586\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.036959801280988226,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.036959801280988226\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.04453197507374984,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.04453197507374984\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.03512385283705048,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.03512385283705048\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039504,\n \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039504\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.02308663508684141,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.02308663508684141\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n \"acc_stderr\": 0.014805384478371155,\n \"acc_norm\": 0.7803320561941252,\n \"acc_norm_stderr\": 0.014805384478371155\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n \"acc_stderr\": 0.01610483388014229,\n \"acc_norm\": 0.3653631284916201,\n \"acc_norm_stderr\": 0.01610483388014229\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.02633661346904663,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.02633661346904663\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.025842248700902168,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.025842248700902168\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n \"acc_stderr\": 0.012695244711379772,\n \"acc_norm\": 0.44589308996088656,\n \"acc_norm_stderr\": 0.012695244711379772\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5955882352941176,\n \"acc_stderr\": 0.029812630701569743,\n \"acc_norm\": 0.5955882352941176,\n \"acc_norm_stderr\": 0.029812630701569743\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233257,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233257\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40514075887392903,\n \"mc1_stderr\": 0.01718561172775337,\n \"mc2\": 0.5791139392635098,\n \"mc2_stderr\": 0.015266138543062658\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836676\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4177407126611069,\n 
\"acc_stderr\": 0.013584820638504832\n }\n}\n```", "repo_url": "https://huggingface.co/Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|arc:challenge|25_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|gsm8k|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hellaswag|10_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-02-04.707768.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-02-04.707768.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-02-04.707768.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T12-02-04.707768.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-02-04.707768.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T12_02_04.707768", "path": ["**/details_harness|winogrande|5_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T12-02-04.707768.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_11T12_02_04.707768", "path": ["results_2024-02-11T12-02-04.707768.parquet"]}, {"split": "latest", "path": ["results_2024-02-11T12-02-04.707768.parquet"]}]}]} | 2024-02-11T12:04:48+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO
Dataset automatically created during the evaluation run of model Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-11T12:02:04.707768(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO\n\n\n\nDataset automatically created during the evaluation run of model Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T12:02:04.707768(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO\n\n\n\nDataset automatically created during the evaluation run of model Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T12:02:04.707768(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
d01bcf4086b29bad3c4fc7879f736b2cc18b6b03 |
# Dataset Card for Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-SFT
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Radu1999/Mistral-Instruct-Ukrainian-SFT](https://huggingface.co/Radu1999/Mistral-Instruct-Ukrainian-SFT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT",
"harness_winogrande_5",
split="train")
```
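
The same pattern works for the aggregated `results` configuration and for any of the per-task configurations listed in this repository's metadata (for example `harness_gsm8k_5`). Below is a minimal sketch; the exact column layout of each parquet file is an assumption and may differ between runs:

```python
from datasets import load_dataset

# Aggregated metrics for the latest run (the "results" config keeps one
# parquet per run, with the "latest" split pointing at the most recent one).
results = load_dataset(
    "open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT",
    "results",
    split="latest",
)
print(results[0])  # column names depend on the parquet schema of the run

# A single per-task configuration, e.g. the 5-shot GSM8K details.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k_details)
```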
## Latest results
These are the [latest results from run 2024-02-11T12:06:32.425794](https://huggingface.co/datasets/open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT/blob/main/results_2024-02-11T12-06-32.425794.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6080373697627688,
"acc_stderr": 0.03304606242674154,
"acc_norm": 0.6127245280093403,
"acc_norm_stderr": 0.03371908018229196,
"mc1": 0.3708690330477356,
"mc1_stderr": 0.016909693580248818,
"mc2": 0.5414084379491539,
"mc2_stderr": 0.01540179961111594
},
"harness|arc:challenge|25": {
"acc": 0.5435153583617748,
"acc_stderr": 0.014555949760496442,
"acc_norm": 0.5784982935153583,
"acc_norm_stderr": 0.014430197069326028
},
"harness|hellaswag|10": {
"acc": 0.6364270065723959,
"acc_stderr": 0.00480044639765335,
"acc_norm": 0.8312089225253934,
"acc_norm_stderr": 0.003738017734037877
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137602,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137602
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7032258064516129,
"acc_stderr": 0.025988500792411894,
"acc_norm": 0.7032258064516129,
"acc_norm_stderr": 0.025988500792411894
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124488,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.558974358974359,
"acc_stderr": 0.025174048384000745,
"acc_norm": 0.558974358974359,
"acc_norm_stderr": 0.025174048384000745
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.01738141556360868,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.01738141556360868
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854052,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02977177522814563,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02977177522814563
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909476,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909476
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.045416094465039504,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.045416094465039504
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597556,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597556
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7867177522349936,
"acc_stderr": 0.014648172749593511,
"acc_norm": 0.7867177522349936,
"acc_norm_stderr": 0.014648172749593511
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.02500931379006972,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.02500931379006972
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3229050279329609,
"acc_stderr": 0.015638440380241488,
"acc_norm": 0.3229050279329609,
"acc_norm_stderr": 0.015638440380241488
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464485,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464485
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6790123456790124,
"acc_stderr": 0.025976566010862744,
"acc_norm": 0.6790123456790124,
"acc_norm_stderr": 0.025976566010862744
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44198174706649285,
"acc_stderr": 0.012683972513598808,
"acc_norm": 0.44198174706649285,
"acc_norm_stderr": 0.012683972513598808
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.019780465954777515,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.019780465954777515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.029504896454595957,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.029504896454595957
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3708690330477356,
"mc1_stderr": 0.016909693580248818,
"mc2": 0.5414084379491539,
"mc2_stderr": 0.01540179961111594
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126732
},
"harness|gsm8k|5": {
"acc": 0.39423805913570886,
"acc_stderr": 0.013460852357095656
}
}
```
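
As a quick sanity check of the numbers above, the per-task MMLU (hendrycksTest) accuracies can be averaged once the JSON report has been parsed. This is only an illustrative sketch, not the official leaderboard aggregation, and it assumes the report file has been downloaded locally under the name shown above:

```python
import json

# Assumption: the results file referenced above has been downloaded locally
# under its original name.
with open("results_2024-02-11T12-06-32.425794.json") as f:
    report = json.load(f)

# In the snippet above the metrics are top-level keys; on disk they may be
# nested under a "results" key, so fall back to that layout if needed.
metrics = report.get("results", report)

# Average acc_norm over every MMLU (hendrycksTest) sub-task.
mmlu = [v["acc_norm"] for k, v in metrics.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU sub-tasks, mean acc_norm = {sum(mmlu) / len(mmlu):.4f}")
```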
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT | [
"region:us"
] | 2024-02-11T12:08:51+00:00 | {"pretty_name": "Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-SFT", "dataset_summary": "Dataset automatically created during the evaluation run of model [Radu1999/Mistral-Instruct-Ukrainian-SFT](https://huggingface.co/Radu1999/Mistral-Instruct-Ukrainian-SFT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T12:06:32.425794](https://huggingface.co/datasets/open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT/blob/main/results_2024-02-11T12-06-32.425794.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6080373697627688,\n \"acc_stderr\": 0.03304606242674154,\n \"acc_norm\": 0.6127245280093403,\n \"acc_norm_stderr\": 0.03371908018229196,\n \"mc1\": 0.3708690330477356,\n \"mc1_stderr\": 0.016909693580248818,\n \"mc2\": 0.5414084379491539,\n \"mc2_stderr\": 0.01540179961111594\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5435153583617748,\n \"acc_stderr\": 0.014555949760496442,\n \"acc_norm\": 0.5784982935153583,\n \"acc_norm_stderr\": 0.014430197069326028\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6364270065723959,\n \"acc_stderr\": 0.00480044639765335,\n \"acc_norm\": 0.8312089225253934,\n \"acc_norm_stderr\": 0.003738017734037877\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.03266204299064678,\n \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.03266204299064678\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137602,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137602\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7032258064516129,\n \"acc_stderr\": 0.025988500792411894,\n \"acc_norm\": 0.7032258064516129,\n \"acc_norm_stderr\": 0.025988500792411894\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124488,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124488\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.558974358974359,\n \"acc_stderr\": 0.025174048384000745,\n \"acc_norm\": 0.558974358974359,\n \"acc_norm_stderr\": 0.025174048384000745\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7926605504587156,\n \"acc_stderr\": 0.01738141556360868,\n \"acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.01738141556360868\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02977177522814563,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02977177522814563\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909476,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909476\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039504,\n \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039504\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597556,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597556\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7867177522349936,\n \"acc_stderr\": 0.014648172749593511,\n \"acc_norm\": 0.7867177522349936,\n \"acc_norm_stderr\": 0.014648172749593511\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.02500931379006972,\n \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.02500931379006972\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3229050279329609,\n \"acc_stderr\": 0.015638440380241488,\n \"acc_norm\": 0.3229050279329609,\n \"acc_norm_stderr\": 0.015638440380241488\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464485,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464485\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.025976566010862744,\n \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.025976566010862744\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44198174706649285,\n \"acc_stderr\": 0.012683972513598808,\n \"acc_norm\": 0.44198174706649285,\n \"acc_norm_stderr\": 0.012683972513598808\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.019780465954777515,\n \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.019780465954777515\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.029504896454595957,\n \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.029504896454595957\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3708690330477356,\n \"mc1_stderr\": 0.016909693580248818,\n \"mc2\": 0.5414084379491539,\n \"mc2_stderr\": 0.01540179961111594\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126732\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39423805913570886,\n \"acc_stderr\": 0.013460852357095656\n 
}\n}\n```", "repo_url": "https://huggingface.co/Radu1999/Mistral-Instruct-Ukrainian-SFT", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|arc:challenge|25_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|gsm8k|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hellaswag|10_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-06-32.425794.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-06-32.425794.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-06-32.425794.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T12-06-32.425794.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-06-32.425794.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T12_06_32.425794", "path": ["**/details_harness|winogrande|5_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T12-06-32.425794.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_11T12_06_32.425794", "path": ["results_2024-02-11T12-06-32.425794.parquet"]}, {"split": "latest", "path": ["results_2024-02-11T12-06-32.425794.parquet"]}]}]} | 2024-02-11T12:09:15+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-SFT
Dataset automatically created during the evaluation run of model Radu1999/Mistral-Instruct-Ukrainian-SFT on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
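For example (a minimal sketch — the details repository name below follows the usual open-llm-leaderboard naming for this model, so adjust it if the actual repository differs):

```python
from datasets import load_dataset

# Load one evaluation configuration; the "train" split always points to the latest run.
data = load_dataset(
    "open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT",
    "harness_winogrande_5",
    split="train",
)
```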
## Latest results
These are the latest results from run 2024-02-11T12:06:32.425794 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-SFT\n\n\n\nDataset automatically created during the evaluation run of model Radu1999/Mistral-Instruct-Ukrainian-SFT on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T12:06:32.425794(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-SFT\n\n\n\nDataset automatically created during the evaluation run of model Radu1999/Mistral-Instruct-Ukrainian-SFT on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T12:06:32.425794(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
62bdebd5e3002e630aadd7ad3c0735427cc1ac2c | # Open Assistant 2 Top-1 Thai
## Dataset Details
### Dataset Description
A top-1 Thai dataset taken from the top-scoring https://huggingface.co/datasets/OpenAssistant/oasst2 conversations. Saved in HF Chat format.
**License**: Apache 2.0
Script: [https://github.com/wannaphong/deep_4_all/tree/main/datasets/oasst](https://github.com/wannaphong/deep_4_all/tree/main/datasets/oasst)
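A minimal loading sketch (the repository id and the `conversation` column are taken from this dataset's configuration; the column layout is shown in the structure section below):

```python
from datasets import load_dataset

# Single "train" split; each row keeps the dialogue in the "conversation" column
# as a list of {"content": ..., "role": ...} messages.
ds = load_dataset("pythainlp/oasst2_thai_top1_chat_format", split="train")
print(ds[0]["conversation"])

# With any chat-model tokenizer, the messages can be rendered directly, e.g.:
# tokenizer.apply_chat_template(ds[0]["conversation"], tokenize=False)
```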
## Dataset Structure
We structure the dataset using the format commonly used as input into Hugging Face Chat Templates:
```
[
{'content': 'ยุงที่แอฟริกาบินหรือเดิน', 'role': 'user'},
{'content': 'บิน เพราะยุงทั่วโลกต่างบินเพื่อหาอาหาร', 'role': 'assistant'}
]
``` | pythainlp/oasst2_thai_top1_chat_format | [
"task_categories:conversational",
"task_categories:question-answering",
"size_categories:n<1K",
"language:th",
"license:apache-2.0",
"region:us"
] | 2024-02-11T12:12:46+00:00 | {"language": ["th"], "license": "apache-2.0", "size_categories": ["n<1K"], "task_categories": ["conversational", "question-answering"], "dataset_info": {"features": [{"name": "conversation", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "langs", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 238601, "num_examples": 167}], "download_size": 96701, "dataset_size": 238601}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-11T12:15:33+00:00 | [] | [
"th"
] | TAGS
#task_categories-conversational #task_categories-question-answering #size_categories-n<1K #language-Thai #license-apache-2.0 #region-us
| # Open Assistant 2 Top-1 Thai
## Dataset Details
### Dataset Description
A top-1 Thai dataset taken from the top scoring URL conversations. Saved in HF Chat format.
License: Apache 2.0
Script: URL
## Dataset Structure
We structure the dataset using the format commonly used as input into Hugging Face Chat Templates:
| [
"# Open Assistant 2 Top-1 Thai",
"## Dataset Details",
"### Dataset Description\n\nA top-1 Thai dataset taken from the top scoring URL conversations. Saved in HF Chat format.\n\nLicense: Apache 2.0\n\nScript: URL",
"## Dataset Structure\n\nWe structure the dataset using the format commonly used as input into Hugging Face Chat Templates:"
] | [
"TAGS\n#task_categories-conversational #task_categories-question-answering #size_categories-n<1K #language-Thai #license-apache-2.0 #region-us \n",
"# Open Assistant 2 Top-1 Thai",
"## Dataset Details",
"### Dataset Description\n\nA top-1 Thai dataset taken from the top scoring URL conversations. Saved in HF Chat format.\n\nLicense: Apache 2.0\n\nScript: URL",
"## Dataset Structure\n\nWe structure the dataset using the format commonly used as input into Hugging Face Chat Templates:"
] |
b6db1a0b87eb1cd9b30062f513a942f3b9e75da5 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
Tokenizer: mdeberta
Dataset: xlwa_en-it
Dataset path = /home/pgajo/working/food/data/XL-WA/data
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | pgajo/xlwa_en-it_mdeberta | [
"region:us"
] | 2024-02-11T12:40:53+00:00 | {} | 2024-02-11T12:41:09+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
Tokenizer: mdeberta
Dataset: xlwa_en-it
Dataset path = /home/pgajo/working/food/data/XL-WA/data
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mdeberta\n\n Dataset: xlwa_en-it\n\n Dataset path = /home/pgajo/working/food/data/XL-WA/data",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mdeberta\n\n Dataset: xlwa_en-it\n\n Dataset path = /home/pgajo/working/food/data/XL-WA/data",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
8b8b007c5ed7c70b8a5452f1f83de61449087d49 | ## K-QA
We are excited to announce the release of K-QA!
This benchmark consists of two parts: a medium-scale corpus of diverse real-world medical inquiries written by patients on K Health (an AI-driven clinical platform), and a subset of carefully crafted answers annotated by a team of in-house medical experts.
The dataset comprises 201 questions and answers, containing over 1,589 ground-truth statements.
Additionally, we provide 1,212 authentic patient questions.
For further details, refer to the [paper](https://arxiv.org/abs/2401.14493).
The recommended evaluation scheme for fine-grained evaluation can be found [here](https://github.com/Itaymanes/K-QA)
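To load the benchmark with the `datasets` library (a minimal sketch; the split names come from this dataset's configuration):

```python
from datasets import load_dataset

# 201 questions with expert-annotated answers and ground-truth statements.
qa = load_dataset("Itaykhealth/K-QA", split="questions_with_answers")
# 1,212 authentic patient questions without annotations.
questions = load_dataset("Itaykhealth/K-QA", split="questions")

print(len(qa), "annotated questions,", len(questions), "raw patient questions")
```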
#### Cite Us
```markdown
@misc{manes2024kqa,
title={K-QA: A Real-World Medical Q&A Benchmark},
author={Itay Manes and Naama Ronn and David Cohen and Ran Ilan Ber and Zehavi Horowitz-Kugler and Gabriel Stanovsky},
year={2024},
eprint={2401.14493},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| Itaykhealth/K-QA | [
"task_categories:question-answering",
"task_categories:text-generation",
"task_categories:conversational",
"size_categories:1K<n<10K",
"language:en",
"license:mit",
"medical",
"arxiv:2401.14493",
"region:us"
] | 2024-02-11T12:43:59+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["1K<n<10K"], "task_categories": ["question-answering", "text-generation", "conversational"], "pretty_name": "K-QA", "configs": [{"config_name": "default", "data_files": [{"split": "questions_with_answers", "path": "questions_w_answers.jsonl"}, {"split": "questions", "path": "questions.jsonl"}]}], "tags": ["medical"]} | 2024-02-11T13:14:03+00:00 | [
"2401.14493"
] | [
"en"
] | TAGS
#task_categories-question-answering #task_categories-text-generation #task_categories-conversational #size_categories-1K<n<10K #language-English #license-mit #medical #arxiv-2401.14493 #region-us
| ## K-QA
We are excited to announce the release of K-QA!
This benchmark consists of two parts: a medium-scale corpus of diverse real-world medical inquiries written by patients on K Health (an AI-driven clinical platform), and a subset of carefully crafted answers annotated by a team of in-house medical experts.
The dataset comprises 201 questions and answers, containing over 1,589 ground-truth statements.
Additionally, we provide 1,212 authentic patient questions.
For further details, refer to the paper.
The recommended evaluation scheme for fine-grained evaluation can be found here
#### Cite Us
| [
"## K-QA\n\nWe are excited to announce the release of K-QA!\n\nThis benchmark consists of two parts: a medium-scale corpus of diverse real-world medical inquiries written by patients on K Health (an AI-driven clinical platform), and a subset of carefully crafted answers annotated by a team of in-house medical experts.\n\nThe dataset comprises 201 questions and answers, containing over 1,589 ground-truth statements.\nAdditionally, we provide 1,212 authentic patient questions.\n\n\nFor further details, refer to the paper.\n\nThe recommended evaluation scheme for fine-grained evaluation can be found here",
"#### Cite Us"
] | [
"TAGS\n#task_categories-question-answering #task_categories-text-generation #task_categories-conversational #size_categories-1K<n<10K #language-English #license-mit #medical #arxiv-2401.14493 #region-us \n",
"## K-QA\n\nWe are excited to announce the release of K-QA!\n\nThis benchmark consists of two parts: a medium-scale corpus of diverse real-world medical inquiries written by patients on K Health (an AI-driven clinical platform), and a subset of carefully crafted answers annotated by a team of in-house medical experts.\n\nThe dataset comprises 201 questions and answers, containing over 1,589 ground-truth statements.\nAdditionally, we provide 1,212 authentic patient questions.\n\n\nFor further details, refer to the paper.\n\nThe recommended evaluation scheme for fine-grained evaluation can be found here",
"#### Cite Us"
] |
77566b35bc4d2872c78053f0720cbef790951eb4 |
## Python Copilot Instructions on How to Code using Alpaca and Yaml
Training and test datasets for building coding multimodal models that understand how to use the open source GitHub projects for the multimodal **Qwen AI** project:
- [Qwen](https://github.com/QwenLM/Qwen)
- [Qwen Agent](https://github.com/QwenLM/Qwen-Agent)
- [Qwen VL Chat](https://github.com/QwenLM/Qwen-VL)
- [Qwen Audio](https://github.com/QwenLM/Qwen-Audio)
This dataset is the 2024-02-10 update for the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based on the code), returns (ordered based on the code), arguments (ordered based on the code), and more.
- Rows: 1070671
- Size: 1.8 GB
- Data type: instruct
- Format: Introduction on code usage using alpaca and yaml response
- Number of python repos: 1274
### How to use the datasets
#### Load Qwen Agent Schema
```python
from datasets import load_dataset

# Implicit string concatenation builds the repository id:
# "matlok/python-text-copilot-training-instruct-ai-research-2024-02-10"
ds_name = (
    "matlok"
    "/"
    "python-text-copilot-training-"
    "instruct-ai-research-"
    "2024-02-10"
)

# Dataset configuration to load; here the Qwen Agent schema view.
dc = "qwen_agent"
ds = load_dataset(ds_name, dc, verification_mode="no_checks")
print(f"ds={ds_name} dataset_config={dc} has {len(ds['view_schema']['file_path'])} unique python modules")
```
```
ds=matlok/python-text-copilot-training-instruct-ai-research-2024-02-10 dataset_config=qwen_agent has 123 unique python modules
```
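To peek at an individual training row, one can reuse the `ds` object from the snippet above (the `file_path` and `desc` columns are listed in the schema below):

```python
# The alpaca-style instruction with its YAML response lives in the "desc" column.
row = ds["view_schema"][0]
print(row["file_path"])
print(row["desc"][:500])
```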
### Schema
The instruction alpaca text with yaml response is in the **desc** column:
```json
{
"active": "bool",
"args": "string",
"args_len": "float64",
"audio_file": "string",
"audio_path": "string",
"class_bases": "string",
"class_name": "string",
"code": "string",
"code_len": "float64",
"desc": "string",
"desc_docstr": "string",
"desc_docstr_len": "float64",
"desc_len": "int64",
"docstr": "string",
"docstr_len": "int64",
"file_path": "string",
"file_type": "string",
"function_names": "string",
"gen_bytes": "int64",
"gen_data_type": "string",
"gen_mode": "string",
"gen_size": "int64",
"gen_valid": "bool",
"height": "int64",
"image_file": "string",
"image_path": "string",
"method_names": "string",
"name": "string",
"num_all_bases": "int64",
"num_bases": "int64",
"num_classes": "int64",
"num_functions": "float64",
"num_imports": "int64",
"num_methods": "float64",
"prompts": "string",
"raises": "string",
"raises_len": "float64",
"recsize": "int64",
"repo": "string",
"returns": "string",
"returns_len": "float64",
"size": "int64",
"src_object": "string",
"total_objects": "int64",
"usage": "string",
"usages": "string",
"width": "int64"
}
```
| matlok/python-text-copilot-training-instruct-ai-research-2024-02-10 | [
"task_categories:text-generation",
"task_categories:question-answering",
"task_ids:parsing",
"size_categories:1M<n<10M",
"license:other",
"python-copilot",
"python-coding",
"python-architecture",
"knowledge-graphs",
"multimodal",
"text-image-audio",
"fine-tuning",
"training",
"question-answering",
"image-knowledge-graph",
"alpaca",
"mp3",
"png",
"text",
"instruct",
"coding",
"task",
"prompt",
"response",
"yaml",
"region:us"
] | 2024-02-11T12:44:43+00:00 | {"license": ["other"], "size_categories": ["1M<n<10M"], "task_categories": ["text-generation", "question-answering"], "task_ids": ["parsing"], "pretty_name": "2024-02-10 - python copilot instructions on how to code using alpaca and yaml", "dataset_info": [{"config_name": "qwen_agent", "splits": [{"name": "view_schema"}]}], "configs": [{"config_name": "qwen_agent", "data_files": [{"split": "view_schema", "path": "schema/train-0022-qwen-agent-qwen_agent.parquet"}]}], "tags": ["python-copilot", "python-coding", "python-architecture", "knowledge-graphs", "multimodal", "text-image-audio", "fine-tuning", "training", "question-answering", "image-knowledge-graph", "alpaca", "mp3", "png", "text", "instruct", "coding", "task", "prompt", "response", "yaml"]} | 2024-02-12T04:47:37+00:00 | [] | [] | TAGS
#task_categories-text-generation #task_categories-question-answering #task_ids-parsing #size_categories-1M<n<10M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #coding #task #prompt #response #yaml #region-us
|
## Python Copilot Instructions on How to Code using Alpaca and Yaml
Training and test datasets for building coding multimodal models that understand how to use the open source GitHub projects for the multimodal Qwen AI project:
- Qwen
- Qwen Agent
- Qwen VL Chat
- Qwen Audio
This dataset is the 2024-02-10 update for the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.
### Details
Each row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based off the code), returns (ordered based off the code), arguments (ordered based off the code), and more.
- Rows: 1070671
- Size: 1.8 GB
- Data type: instruct
- Format: Introduction on code usage using alpaca and yaml response
- Number of python repos: 1274
### How to use the datasets
#### Load Qwen Agent Schema
### Schema
The instruction alpaca text with yaml response is in the desc column:
| [
"## Python Copilot Instructions on How to Code using Alpaca and Yaml\n\nTraining and test datasets for building coding multimodal models that understand how to use the open source GitHub projects for the multimodal Qwen AI project:\n\n- Qwen\n- Qwen Agent\n- Qwen VL Chat\n- Qwen Audio\n\nThis dataset is the 2024-02-10 update for the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based off the code), returns (ordered based off the code), arguments (ordered based off the code), and more.\n\n- Rows: 1070671\n- Size: 1.8 GB\n- Data type: instruct\n- Format: Introduction on code usage using alpaca and yaml response\n- Number of python repos: 1274",
"### How to use the datasets",
"#### Load Qwen Agent Schema",
"### Schema\n\nThe instruction alpaca text with yaml response is in the desc column:"
] | [
"TAGS\n#task_categories-text-generation #task_categories-question-answering #task_ids-parsing #size_categories-1M<n<10M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #coding #task #prompt #response #yaml #region-us \n",
"## Python Copilot Instructions on How to Code using Alpaca and Yaml\n\nTraining and test datasets for building coding multimodal models that understand how to use the open source GitHub projects for the multimodal Qwen AI project:\n\n- Qwen\n- Qwen Agent\n- Qwen VL Chat\n- Qwen Audio\n\nThis dataset is the 2024-02-10 update for the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based off the code), returns (ordered based off the code), arguments (ordered based off the code), and more.\n\n- Rows: 1070671\n- Size: 1.8 GB\n- Data type: instruct\n- Format: Introduction on code usage using alpaca and yaml response\n- Number of python repos: 1274",
"### How to use the datasets",
"#### Load Qwen Agent Schema",
"### Schema\n\nThe instruction alpaca text with yaml response is in the desc column:"
] |
b8b85adae451b19899e063ee22ed73d3af91d022 |
# Dataset Card for Evaluation run of yam-peleg/Experiment4-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yam-peleg/Experiment4-7B](https://huggingface.co/yam-peleg/Experiment4-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yam-peleg__Experiment4-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-11T12:47:14.139387](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment4-7B/blob/main/results_2024-02-11T12-47-14.139387.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6545438799099946,
"acc_stderr": 0.03201109405695293,
"acc_norm": 0.6554330760311358,
"acc_norm_stderr": 0.032658616723143415,
"mc1": 0.5642594859241126,
"mc1_stderr": 0.01735834539886313,
"mc2": 0.7039319058753165,
"mc2_stderr": 0.014998717036441298
},
"harness|arc:challenge|25": {
"acc": 0.7022184300341296,
"acc_stderr": 0.013363080107244482,
"acc_norm": 0.7218430034129693,
"acc_norm_stderr": 0.013094469919538805
},
"harness|hellaswag|10": {
"acc": 0.7112129057956582,
"acc_stderr": 0.004522725412556956,
"acc_norm": 0.8809002190798646,
"acc_norm_stderr": 0.003232439139881551
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.025542846817400506,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.025542846817400506
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857416,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887037,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887037
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.02616056824660146,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.02616056824660146
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066302,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45027932960893857,
"acc_stderr": 0.016639615236845814,
"acc_norm": 0.45027932960893857,
"acc_norm_stderr": 0.016639615236845814
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897229,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897229
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000328,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000328
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5642594859241126,
"mc1_stderr": 0.01735834539886313,
"mc2": 0.7039319058753165,
"mc2_stderr": 0.014998717036441298
},
"harness|winogrande|5": {
"acc": 0.8113654301499605,
"acc_stderr": 0.010995172318019815
},
"harness|gsm8k|5": {
"acc": 0.6345716451857468,
"acc_stderr": 0.013264282030266637
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_yam-peleg__Experiment4-7B | [
"region:us"
] | 2024-02-11T12:49:30+00:00 | {"pretty_name": "Evaluation run of yam-peleg/Experiment4-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [yam-peleg/Experiment4-7B](https://huggingface.co/yam-peleg/Experiment4-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yam-peleg__Experiment4-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T12:47:14.139387](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment4-7B/blob/main/results_2024-02-11T12-47-14.139387.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6545438799099946,\n \"acc_stderr\": 0.03201109405695293,\n \"acc_norm\": 0.6554330760311358,\n \"acc_norm_stderr\": 0.032658616723143415,\n \"mc1\": 0.5642594859241126,\n \"mc1_stderr\": 0.01735834539886313,\n \"mc2\": 0.7039319058753165,\n \"mc2_stderr\": 0.014998717036441298\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.013363080107244482,\n \"acc_norm\": 0.7218430034129693,\n \"acc_norm_stderr\": 0.013094469919538805\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7112129057956582,\n \"acc_stderr\": 0.004522725412556956,\n \"acc_norm\": 0.8809002190798646,\n \"acc_norm_stderr\": 0.003232439139881551\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.025542846817400506,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.025542846817400506\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857416,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887037,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887037\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.02616056824660146,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.02616056824660146\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8339719029374202,\n \"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 0.8339719029374202,\n \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45027932960893857,\n \"acc_stderr\": 0.016639615236845814,\n \"acc_norm\": 0.45027932960893857,\n \"acc_norm_stderr\": 0.016639615236845814\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.012741974333897229,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.012741974333897229\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000328,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000328\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5642594859241126,\n \"mc1_stderr\": 0.01735834539886313,\n \"mc2\": 0.7039319058753165,\n \"mc2_stderr\": 0.014998717036441298\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8113654301499605,\n \"acc_stderr\": 0.010995172318019815\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6345716451857468,\n \"acc_stderr\": 0.013264282030266637\n }\n}\n```", "repo_url": 
"https://huggingface.co/yam-peleg/Experiment4-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|arc:challenge|25_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|gsm8k|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hellaswag|10_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-47-14.139387.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-47-14.139387.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-47-14.139387.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T12-47-14.139387.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-47-14.139387.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-47-14.139387.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["**/details_harness|winogrande|5_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T12-47-14.139387.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_11T12_47_14.139387", "path": ["results_2024-02-11T12-47-14.139387.parquet"]}, {"split": "latest", "path": 
["results_2024-02-11T12-47-14.139387.parquet"]}]}]} | 2024-02-11T12:49:50+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of yam-peleg/Experiment4-7B
Dataset automatically created during the evaluation run of model yam-peleg/Experiment4-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
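For example, the snippet below loads the five-shot Winogrande details for this run and the aggregated metrics stored in the "results" configuration; the repository, configuration, and split names are the ones listed for this run above.
```python
from datasets import load_dataset

# Per-task details: one configuration per evaluated task, "train" points to the latest run.
data = load_dataset("open-llm-leaderboard/details_yam-peleg__Experiment4-7B",
                    "harness_winogrande_5",
                    split="train")

# Aggregated metrics: the "results" configuration, with a "latest" split for the newest run.
results = load_dataset("open-llm-leaderboard/details_yam-peleg__Experiment4-7B",
                       "results",
                       split="latest")
```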
## Latest results
These are the latest results from run 2024-02-11T12:47:14.139387 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of yam-peleg/Experiment4-7B\n\n\n\nDataset automatically created during the evaluation run of model yam-peleg/Experiment4-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T12:47:14.139387(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of yam-peleg/Experiment4-7B\n\n\n\nDataset automatically created during the evaluation run of model yam-peleg/Experiment4-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T12:47:14.139387(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
e21945e1b9bd9d38887be015ad51df6a1148518d |
# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-7B-0210-ties
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [louisbrulenaudet/Pearl-7B-0210-ties](https://huggingface.co/louisbrulenaudet/Pearl-7B-0210-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-ties",
"harness_winogrande_5",
split="train")
```
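You can also target the aggregated metrics or a specific timestamped run directly. The sketch below is a minimal illustration that assumes the "results" configuration follows the same split-naming scheme as the task configurations listed in this card (a timestamped split plus a "latest" alias):
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-ties"

# Aggregated metrics for the most recent run ("latest" tracks the newest results).
results = load_dataset(repo, "results", split="latest")

# Per-sample details for a single task, pinned to this card's timestamped run
# instead of "latest".
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="2024_02_11T13_02_55.830318")
```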
## Latest results
These are the [latest results from run 2024-02-11T13:02:55.830318](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-ties/blob/main/results_2024-02-11T13-02-55.830318.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6445006408453816,
"acc_stderr": 0.03221707902550851,
"acc_norm": 0.6435699567376953,
"acc_norm_stderr": 0.03289233804602633,
"mc1": 0.5507955936352509,
"mc1_stderr": 0.0174129419861153,
"mc2": 0.7046726045086744,
"mc2_stderr": 0.014909807031624017
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.013582571095815291,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.01325001257939344
},
"harness|hellaswag|10": {
"acc": 0.7170882294363673,
"acc_stderr": 0.004494934025462338,
"acc_norm": 0.8862776339374626,
"acc_norm_stderr": 0.00316824935188931
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.037150621549989056,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.037150621549989056
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328974,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328974
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530343,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601436,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834843,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834843
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42681564245810055,
"acc_stderr": 0.016542401954631917,
"acc_norm": 0.42681564245810055,
"acc_norm_stderr": 0.016542401954631917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.0264930332251459,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.0264930332251459
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4641460234680574,
"acc_stderr": 0.012737361318730583,
"acc_norm": 0.4641460234680574,
"acc_norm_stderr": 0.012737361318730583
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.02881472242225419,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.02881472242225419
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0190709855896875,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0190709855896875
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5507955936352509,
"mc1_stderr": 0.0174129419861153,
"mc2": 0.7046726045086744,
"mc2_stderr": 0.014909807031624017
},
"harness|winogrande|5": {
"acc": 0.8397790055248618,
"acc_stderr": 0.010309209498187479
},
"harness|gsm8k|5": {
"acc": 0.6997725549658832,
"acc_stderr": 0.012625423152283034
}
}
```
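For quick inspection, the per-task numbers above can be dropped into a small table. This is only an illustrative sketch (two entries reproduced by hand for brevity; the reported "all" aggregate may be computed differently, e.g. with task weighting):
```python
import pandas as pd

# A couple of the per-task entries shown above, copied by hand for brevity;
# in practice you would parse the full results JSON instead.
results_json = {
    "harness|winogrande|5": {"acc": 0.8397790055248618, "acc_stderr": 0.010309209498187479},
    "harness|gsm8k|5": {"acc": 0.6997725549658832, "acc_stderr": 0.012625423152283034},
}

# One row per task, one column per metric, sorted by accuracy.
df = pd.DataFrame.from_dict(results_json, orient="index")
print(df.sort_values("acc", ascending=False))
```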
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
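At a high level, each evaluated task is exposed as its own configuration (named `harness_<task>_<n_fewshot>`), and each configuration carries a timestamped split plus a "latest" split. A minimal sketch for enumerating them with the standard `datasets` helpers:
```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-ties"

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs), "configurations, e.g.:", configs[:5])

# Each configuration has a timestamped split and a "latest" split.
print(get_dataset_split_names(repo, "harness_winogrande_5"))
```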
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-ties | [
"region:us"
] | 2024-02-11T13:05:13+00:00 | {"pretty_name": "Evaluation run of louisbrulenaudet/Pearl-7B-0210-ties", "dataset_summary": "Dataset automatically created during the evaluation run of model [louisbrulenaudet/Pearl-7B-0210-ties](https://huggingface.co/louisbrulenaudet/Pearl-7B-0210-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-ties\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T13:02:55.830318](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-ties/blob/main/results_2024-02-11T13-02-55.830318.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6445006408453816,\n \"acc_stderr\": 0.03221707902550851,\n \"acc_norm\": 0.6435699567376953,\n \"acc_norm_stderr\": 0.03289233804602633,\n \"mc1\": 0.5507955936352509,\n \"mc1_stderr\": 0.0174129419861153,\n \"mc2\": 0.7046726045086744,\n \"mc2_stderr\": 0.014909807031624017\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.013582571095815291,\n \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.01325001257939344\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7170882294363673,\n \"acc_stderr\": 0.004494934025462338,\n \"acc_norm\": 0.8862776339374626,\n \"acc_norm_stderr\": 0.00316824935188931\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.037150621549989056,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.037150621549989056\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328974,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328974\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.822477650063857,\n \"acc_stderr\": 0.013664230995834843,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834843\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42681564245810055,\n \"acc_stderr\": 0.016542401954631917,\n \"acc_norm\": 0.42681564245810055,\n \"acc_norm_stderr\": 0.016542401954631917\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.0264930332251459,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.0264930332251459\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4641460234680574,\n \"acc_stderr\": 0.012737361318730583,\n \"acc_norm\": 0.4641460234680574,\n \"acc_norm_stderr\": 0.012737361318730583\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.02881472242225419,\n \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.02881472242225419\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5507955936352509,\n \"mc1_stderr\": 0.0174129419861153,\n \"mc2\": 0.7046726045086744,\n \"mc2_stderr\": 0.014909807031624017\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8397790055248618,\n \"acc_stderr\": 0.010309209498187479\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6997725549658832,\n \"acc_stderr\": 0.012625423152283034\n 
}\n}\n```", "repo_url": "https://huggingface.co/louisbrulenaudet/Pearl-7B-0210-ties", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|arc:challenge|25_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|gsm8k|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hellaswag|10_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T13-02-55.830318.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T13-02-55.830318.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T13-02-55.830318.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T13-02-55.830318.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T13-02-55.830318.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T13_02_55.830318", "path": ["**/details_harness|winogrande|5_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T13-02-55.830318.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_11T13_02_55.830318", "path": ["results_2024-02-11T13-02-55.830318.parquet"]}, {"split": "latest", "path": ["results_2024-02-11T13-02-55.830318.parquet"]}]}]} | 2024-02-11T13:05:34+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-7B-0210-ties
Dataset automatically created during the evaluation run of model louisbrulenaudet/Pearl-7B-0210-ties on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
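For example, a minimal sketch — the repository id below is an assumption based on the usual `open-llm-leaderboard/details_<org>__<model>` naming convention for these evaluation-details datasets, and `harness_winogrande_5` is one of the configurations listed above:

```python
from datasets import load_dataset

# Load the WinoGrande 5-shot details for this evaluation run.
# The repository id is assumed from the standard naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-ties",
    "harness_winogrande_5",
    split="train",
)
```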
## Latest results
These are the latest results from run 2024-02-11T13:02:55.830318 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-7B-0210-ties\n\n\n\nDataset automatically created during the evaluation run of model louisbrulenaudet/Pearl-7B-0210-ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T13:02:55.830318(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-7B-0210-ties\n\n\n\nDataset automatically created during the evaluation run of model louisbrulenaudet/Pearl-7B-0210-ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T13:02:55.830318(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
157e0e6d01e316875dbbc13b8b24370c27d0e244 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | hk742/vaya-gpt-flagged-answers | [
"region:us"
] | 2024-02-11T13:06:56+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data.csv"}]}]} | 2024-02-12T04:49:20+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
1368006da7f582a9a28298d05c9915d7f9ef4ee7 |
# About data
This dataset consists of 1 million digits of pi, i.e., 31415926535...
# Task
You can use this dataset for
* Time series forecasting
* Time series classification | pkr7098/pi | [
"task_categories:time-series-forecasting",
"size_categories:100K<n<1M",
"license:mit",
"region:us"
] | 2024-02-11T13:24:11+00:00 | {"license": "mit", "size_categories": ["100K<n<1M"], "task_categories": ["time-series-forecasting"], "pretty_name": "pi"} | 2024-02-11T13:37:28+00:00 | [] | [] | TAGS
#task_categories-time-series-forecasting #size_categories-100K<n<1M #license-mit #region-us
|
# About data
This dataset consists of 1 million pi digits i.e., 31415926535...
# Task
You can use this dataset for
* Time series forecasting
* Time series classification | [
"# About data\n\nThis dataset consists of 1 million pi digits i.e., 31415926535...",
"# Task\n\nYou can use this dataset for \n* Time series forecasting\n* Time series classification"
] | [
"TAGS\n#task_categories-time-series-forecasting #size_categories-100K<n<1M #license-mit #region-us \n",
"# About data\n\nThis dataset consists of 1 million pi digits i.e., 31415926535...",
"# Task\n\nYou can use this dataset for \n* Time series forecasting\n* Time series classification"
] |
77c61472aa1f82fefa5190d3e6fa6c0bb0a5dd0d |
# Learning to Edit: Aligning LLMs with Knowledge Editing
[](https://github.com/YJiangcm/LTE)
We introduce a novel Learning to Edit (**LTE**) framework for effective and efficient knowledge editing of large language models (LLMs).
Our LTE framework focuses on teaching LLMs to **apply** updated knowledge to input questions, inspired by the philosophy of "_Teach a man to fish_."
As the figure below shows, LTE features a two-phase process: (i) the **Alignment Phase**, which fine-tunes LLMs on a meticulously curated parallel dataset to make reliable, in-scope edits while preserving out-of-scope information and linguistic proficiency; and (ii) the **Inference Phase**, which employs a retrieval-based mechanism for real-time and mass knowledge editing.
<p align="center">
<br>
<img src="https://github.com/YJiangcm/LTE/raw/master/figures/method.jpg" width="1200"/>
<br>
</p>
## ⚙️ How to implement
### Requirements
**Note: Please use Python 3.10+ for LTE.** To get started, simply install conda and run:
```
conda create -n LTE python=3.10
conda activate LTE
conda install pytorch==2.1.1 torchvision==0.16.1 torchaudio==2.1.1 pytorch-cuda=12.1 -c pytorch -c nvidia
pip install -r requirements.txt
```
### 1. Alignment Phase
Firstly, please download the training data of LTE from [HuggingFace](https://huggingface.co/datasets/YuxinJiang/LTE_train_data) and put it into [data/](data/).
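One possible way to fetch it — a minimal sketch using `huggingface_hub`; whether the training scripts expect the files directly under `data/` with their original names is an assumption, so check the repository layout:

```python
from huggingface_hub import snapshot_download

# Download the LTE training data into the local data/ directory.
# Keeping the original file names under data/ is an assumption about
# what the fine-tuning scripts expect.
snapshot_download(
    repo_id="YuxinJiang/LTE_train_data",
    repo_type="dataset",
    local_dir="data/",
)
```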
#### LLaMA2-Chat-7B
The code is based on [FastChat](https://github.com/lm-sys/FastChat). Standard fine-tuning was conducted on 4×A100 GPUs (80G) for about 9 hours.
```bash
cd LTE/
bash FastChat/ft_train.sh
```
To reduce the total memory footprint, LTE also supports [LoRA](https://arxiv.org/abs/2106.09685), which fine-tunes low-rank slices of the query, key, and value embedding heads.
```bash
cd LTE/
bash FastChat/lora_train.sh
```
#### Qwen-Chat-7B
The code is based on [Qwen](https://github.com/QwenLM/Qwen). Standard fine-tuning was conducted on 4×A100 GPUs (80G) for about 9 hours.
```bash
cd LTE/
bash Qwen/finetune/finetune_ds.sh
```
To reduce the total memory footprint, LTE also supports [LoRA](https://arxiv.org/abs/2106.09685), which fine-tunes low-rank slices of the query, key, and value embedding heads.
```bash
cd LTE/
bash Qwen/finetune/finetune_lora_single_gpu.sh
```
### 2. Inference Phase
The evaluation of our proposed LTE is based on [EasyEdit](https://github.com/zjunlp/EasyEdit).
Please run the following command for experiments of **LLaMA2-Chat-7B**:
```bash
cd LTE/
bash EasyEdit/run_lte_llama.sh
```
Please run the following command for experiments of **Qwen-Chat-7B**:
```bash
cd LTE/
bash EasyEdit/run_lte_qwen.sh
```
## 📝 Citation
Please cite our paper if you use the data or code in this repo.
```
@misc{jiang2023followbench,
title={FollowBench: A Multi-level Fine-grained Constraints Following Benchmark for Large Language Models},
author={Yuxin Jiang and Yufei Wang and Xingshan Zeng and Wanjun Zhong and Liangyou Li and Fei Mi and Lifeng Shang and Xin Jiang and Qun Liu and Wei Wang},
year={2023},
eprint={2310.20410},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | YuxinJiang/LTE_train_data | [
"task_categories:text-generation",
"task_categories:question-answering",
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"arxiv:2106.09685",
"arxiv:2310.20410",
"region:us"
] | 2024-02-11T13:27:20+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation", "question-answering"]} | 2024-02-17T02:31:06+00:00 | [
"2106.09685",
"2310.20410"
] | [
"en"
] | TAGS
#task_categories-text-generation #task_categories-question-answering #size_categories-10K<n<100K #language-English #license-apache-2.0 #arxiv-2106.09685 #arxiv-2310.20410 #region-us
|
# Learning to Edit: Aligning LLMs with Knowledge Editing
 framework for effective and efficient knowledge editing of large language models (LLMs).
our LTE framework focuses on teaching LLMs to apply updated knowledge into input questions, inspired by the philosophy of "_Teach a man to fish_."
As the below figure shows, LTE features a two-phase process: (i) the Alignment Phase, which fine-tunes LLMs on a meticulously curated parallel dataset to make reliable, in-scope edits while preserving out-of-scope information and linguistic proficiency; and (ii) the Inference Phase, which employs a retrieval-based mechanism for real-time and mass knowledge editing.
<p align="center">
<br>
<img src="URL width="1200"/>
<br>
</p>
## ️ How to implement
### Requirements
Note: Please use Python 3.10+ for LTE. To get started, simply install conda and run:
### 1. Alignment Phrase
Firstly, please download the training data of LTE from HuggingFace and put it into data/.
#### LLaMA2-Chat-7B
The code is based on FastChat. Standard fine-tuning was conducted on 4×A100 GPUs (80G) for about 9 hours.
To reduce the total memory footprint, LTE also supports LoRA, which fine-tunes low-rank slices of the query, key, and value embedding heads.
#### Qwen-Chat-7B
The code is based on Qwen. Standard fine-tuning was conducted on 4×A100 GPUs (80G) for about 9 hours.
To reduce the total memory footprint, LTE also supports LoRA, which fine-tunes low-rank slices of the query, key, and value embedding heads.
### 2. Inference Phrase
The evaluation of our proposed LTE is based on EasyEdit.
Please run the following command for experiments of LLaMA2-Chat-7B:
Please run the following command for experiments of Qwen-Chat-7B:
## Citation
Please cite our paper if you use the data or code in this repo.
| [
"# Learning to Edit: Aligning LLMs with Knowledge Editing\n\n framework for effective and efficient knowledge editing of large language models (LLMs).\nour LTE framework focuses on teaching LLMs to apply updated knowledge into input questions, inspired by the philosophy of \"_Teach a man to fish_.\"\n\nAs the below figure shows, LTE features a two-phase process: (i) the Alignment Phase, which fine-tunes LLMs on a meticulously curated parallel dataset to make reliable, in-scope edits while preserving out-of-scope information and linguistic proficiency; and (ii) the Inference Phase, which employs a retrieval-based mechanism for real-time and mass knowledge editing.\n\n<p align=\"center\">\n <br>\n <img src=\"URL width=\"1200\"/>\n <br>\n</p>",
"## ️ How to implement",
"### Requirements\nNote: Please use Python 3.10+ for LTE. To get started, simply install conda and run:",
"### 1. Alignment Phrase\nFirstly, please download the training data of LTE from HuggingFace and put it into data/.",
"#### LLaMA2-Chat-7B\nThe code is based on FastChat. Standard fine-tuning was conducted on 4×A100 GPUs (80G) for about 9 hours.\n\n\nTo reduce the total memory footprint, LTE also supports LoRA, which fine-tunes low-rank slices of the query, key, and value embedding heads.",
"#### Qwen-Chat-7B\nThe code is based on Qwen. Standard fine-tuning was conducted on 4×A100 GPUs (80G) for about 9 hours.\n\n\nTo reduce the total memory footprint, LTE also supports LoRA, which fine-tunes low-rank slices of the query, key, and value embedding heads.",
"### 2. Inference Phrase\nThe evaluation of our proposed LTE is based on EasyEdit.\n\nPlease run the following command for experiments of LLaMA2-Chat-7B:\n\n\nPlease run the following command for experiments of Qwen-Chat-7B:",
"## Citation\nPlease cite our paper if you use the data or code in this repo."
] | [
"TAGS\n#task_categories-text-generation #task_categories-question-answering #size_categories-10K<n<100K #language-English #license-apache-2.0 #arxiv-2106.09685 #arxiv-2310.20410 #region-us \n",
"# Learning to Edit: Aligning LLMs with Knowledge Editing\n\n framework for effective and efficient knowledge editing of large language models (LLMs).\nour LTE framework focuses on teaching LLMs to apply updated knowledge into input questions, inspired by the philosophy of \"_Teach a man to fish_.\"\n\nAs the below figure shows, LTE features a two-phase process: (i) the Alignment Phase, which fine-tunes LLMs on a meticulously curated parallel dataset to make reliable, in-scope edits while preserving out-of-scope information and linguistic proficiency; and (ii) the Inference Phase, which employs a retrieval-based mechanism for real-time and mass knowledge editing.\n\n<p align=\"center\">\n <br>\n <img src=\"URL width=\"1200\"/>\n <br>\n</p>",
"## ️ How to implement",
"### Requirements\nNote: Please use Python 3.10+ for LTE. To get started, simply install conda and run:",
"### 1. Alignment Phrase\nFirstly, please download the training data of LTE from HuggingFace and put it into data/.",
"#### LLaMA2-Chat-7B\nThe code is based on FastChat. Standard fine-tuning was conducted on 4×A100 GPUs (80G) for about 9 hours.\n\n\nTo reduce the total memory footprint, LTE also supports LoRA, which fine-tunes low-rank slices of the query, key, and value embedding heads.",
"#### Qwen-Chat-7B\nThe code is based on Qwen. Standard fine-tuning was conducted on 4×A100 GPUs (80G) for about 9 hours.\n\n\nTo reduce the total memory footprint, LTE also supports LoRA, which fine-tunes low-rank slices of the query, key, and value embedding heads.",
"### 2. Inference Phrase\nThe evaluation of our proposed LTE is based on EasyEdit.\n\nPlease run the following command for experiments of LLaMA2-Chat-7B:\n\n\nPlease run the following command for experiments of Qwen-Chat-7B:",
"## Citation\nPlease cite our paper if you use the data or code in this repo."
] |
bc3fa3ffbd812d312f19f9cd5d8540ddbc59d713 | # Dataset Card for "ExeBench-Switch-Eval-small-gpt3.5-zeroshot-result"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | zhangshuoming/ExeBench-Switch-Eval-small-gpt3.5-zeroshot-result | [
"region:us"
] | 2024-02-11T13:27:40+00:00 | {"dataset_info": {"features": [{"name": "c", "dtype": "string"}, {"name": "asm", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 223223, "num_examples": 80}], "download_size": 89102, "dataset_size": 223223}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-11T13:27:47+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ExeBench-Switch-Eval-small-gpt3.5-zeroshot-result"
More Information needed | [
"# Dataset Card for \"ExeBench-Switch-Eval-small-gpt3.5-zeroshot-result\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ExeBench-Switch-Eval-small-gpt3.5-zeroshot-result\"\n\nMore Information needed"
] |
51e0cc2d7f90a945a9e4038a10037582accb5b64 | # Dataset Metadata
## Identification Information
### Citation
- **Title**: Aerial surveys of a sunflower crop’s lifecycle from May to September 2023
- **Originator**: Sofia University - faculty of mathematics and informatics, SAP LABS Bulgaria
- **Publication Date**: 2023.11.08
### Abstract
Efficient food production is shaping up to be one of the new frontiers for new technologies and solutions. One such prominent domain is the remote sensing ecosystem, and more precisely, technologies such as multispectral and hyperspectral sensing equipment.
These devices are gradually moving from the academic environment to the industry world, and their decreasing cost allows many new applications to emerge.
Multispectral drones are advanced unmanned aerial vehicles (UAVs) equipped with cameras or sensors, capable of capturing imagery across multiple spectral bands. Unlike traditional RGB counterparts, they capture data not only within, but also beyond the visible spectrum, such as near-infrared (NIR). This data can provide valuable insights for various applications, including agriculture, environmental monitoring, land surveying, and more.
One of the main uses of multispectral drones in agriculture is related to the calculation of vegetation (NDVI, NDRE etc.) and other indices that inform the farmer about crop development, stress etc. The latter can also serve as indirect indicator of soil conditions and water distribution. This approach enables more accurate and detailed assessments compared to traditional visual inspections.
Similar multispectral data is provided by earth observation satellites such as Sentinel-2; however, they are limited with respect to revisit time, spatial resolution and, most importantly, their inability to see through clouds. Therefore, the use of multispectral drones can fill these operational gaps and provide more precise and timely data to the farmers.
However, to work simultaneously with satellite and drone data, analysts must have confidence in the precision and comparability of these two data sources (e.g., for NDVI). For example, the DJI P4 multispectral images have slightly different band sensitivities when compared with Sentinel-2, which may cause deviations in the index values. Another prominent problem is related to the field illumination, which depends on time of day and weather conditions. Even though the DJI P4 drone has a calibration sensor, supposed to compensate for the illuminating spectrum deviations, to the best of our knowledge, no public data set exists that demonstrates the tolerance of deviations between e.g., different drone footages or between DJI P4 and Sentinel-2. Moreover, Sentinel-2 implements atmospheric corrections that may contribute to such deviations as well.
Machine learning models can be utilized to extract valuable insights from multispectral data in precision agriculture applications. By leveraging the rich information captured across multiple spectral bands, machine learning algorithms can analyze and interpret the data to provide actionable recommendations for farmers and agronomists, such as highlighting areas with the most vegetation stress. Successful implementation of machine learning models for precision agriculture, based on multispectral data, requires high quality data sets, which are currently scarce. Therefore, collection of a high-quality, multispectral data set is a prerequisite to future machine learning experiments in the domain of precision farming.
For these reasons, our research team conducted multiple surveys, tracking the entire lifecycle of a sunflower field and gathering spectral data.
### Purpose
This dataset was developed as part of a research project, investigating the capabilities and application of drones and multispectral cameras for the agricultural domain.
The provided data can be used for the following scenarios:
1) Training models relying on multispectral datasources.
2) Improve existing algorithms in the computer vision domain.
## Time Period of Content
- **Single Date/Time**: Start Date 2023-04-25 to End Date 2023-09-04
## Data Quality Information
Composite images have been generated with DJI Terra, with 70% frontal and 60% side overlap.
There are instances where a survey has been completed in the span of 2 days due to adverse environmental conditions.
Although there was an effort to execute surveys in a consistent time window (morning and afternoon), this is not the case for some of the runs.
The raw data is validated to be complete - representing the entirety of the observed field for every survey.
### Horizontal Coordinate System
- **Geographic Coordinate System**: EPSG:4326
- **Angular Unit**: Decimal degrees
- **Datum**: WGS 84
- **Prime Meridian**: Greenwich
- **Domain**: Raster
## Entity and Attribute Information
### Detailed Description
#### Entities
Data is organized into directories. Each directory corresponds to one survey and uses **DD.MM.YYYY** format.
Each survey directory contains 2 subdirectories: **raw** and **results**.
The **results** directory contains the output of the DJI Terra processing of the raw data collected by the drone.
- Contents:
- raw
- Composite images, derived from a single drone sensor. Images follow **result_<Blue, Green, etc.>** nomenclature.
- .prj projection file for every composite image
- .tfw georeference file for every composite image
- results
- subdirectories for each executed flight, required to complete the survey.
- each subdirectory keeps the raw data for each sensing point on the drone's mission path
- one point is represented by one JPG image and 5 grayscale TIF images, corresponding to each sensor of the drone
All images are injected with geo-referencing data, timestamps, image quality, camera properties.
The datasets hold additional metadata in two files:
- field_shape.geojson - bounding box for the sunflower field
- crop_details.txt - information about the crop
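As an illustration of how the per-band composites can be used, here is a minimal NDVI sketch with `rasterio`; the exact band file names and extension (e.g. `result_Red.TIF`, `result_NIR.TIF`) and the example survey folder name are assumptions based on the nomenclature above:

```python
import numpy as np
import rasterio

# Hypothetical survey folder in DD.MM.YYYY format; adjust to a real one.
survey_dir = "25.04.2023/raw"

# Band file names follow the result_<Band> nomenclature described above;
# the exact spelling and extension are assumptions.
with rasterio.open(f"{survey_dir}/result_Red.TIF") as red_src, \
     rasterio.open(f"{survey_dir}/result_NIR.TIF") as nir_src:
    red = red_src.read(1).astype("float32")
    nir = nir_src.read(1).astype("float32")

# NDVI = (NIR - Red) / (NIR + Red), guarding against division by zero.
denom = nir + red
ndvi = np.where(denom == 0, 0.0, (nir - red) / denom)
print("NDVI range:", float(ndvi.min()), "to", float(ndvi.max()))
```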
#### Capture aperture
Drone surveys are executed with DJI Phantom 4 Multispectral drone. The drone uses the following sensors to capture data:
Sensors: Six 1/2.9” CMOS
Filters:
- Blue (B): 450 nm ± 16 nm
- Green (G): 560 nm ± 16 nm
- Red (R): 650 nm ± 16 nm
- Red edge (RE): 730 nm ± 16 nm
- Near-infrared (NIR): 840 nm ± 26 nm
Lenses:
- FOV (Field of View): 62.7°
- Focal Length: 5.74 mm
- Aperture: f/2.2
Software used for generating composite images: DJI Terra 3.6.8.
## Metadata Reference Information
- **Metadata Contact**:
- **Name**: Pavel Genevski
- **Organization**: SAP LABS Bulgaria
- **Position**: Research expert
- **Email**: [email protected]
- **Metadata Contact**:
- **Name**: Radoslav Stefanov
- **Organization**: SAP LABS Bulgaria
- **Position**: Senior developer
- **Email**: [email protected]
- **Metadata Date**: Date of creating this metadata (2023.11.08)
- **Metadata Standard Name**: FGDC Content Standard for Digital Geospatial Metadata
## Additional Information
- **Keywords**: agriculture, multispectral, crop, sunflower
- **Access Constraints**: CC BY 4.0
- **Use Constraints**: CC BY 4.0
| su-fmi/msi-drone-crop-surveys | [
"region:us"
] | 2024-02-11T13:30:53+00:00 | {} | 2024-02-16T17:30:54+00:00 | [] | [] | TAGS
#region-us
| # Dataset Metadata
## Identification Information
- Title:Aerial surveys of a sunflower crop’s lifecycle from May to September 2023
- Originator: Sofia University - faculty of mathematics and informatics, SAP LABS Bulgaria
- Publication Date: 2023.11.08
### Abstract
Efficient food production is shaping up to be one of the new frontiers for new technologies and solutions. One such prominent domain is the remote sensing ecosystem, and more precicely, technologies such as multispectral and hyperspectral sensing equipment.
These devices are gradually moving from the academia environment to the industry world, and there decrease is cost allows for many new applications to emerge.
Multispectral drones are advanced unmanned aerial vehicles (UAVs) equipped with cameras or sensors, capable of capturing imagery across multiple spectral bands. Unlike traditional RGB counterparts, they capture data not only within, but also beyond the visible spectrum, such as near-infrared (NIR). This data can provide valuable insights for various applications, including agriculture, environmental monitoring, land surveying, and more.
One of the main uses of multispectral drones in agriculture is related to the calculation of vegetation (NDVI, NDRE etc.) and other indices that inform the farmer about crop development, stress etc. The latter can also serve as indirect indicator of soil conditions and water distribution. This approach enables more accurate and detailed assessments compared to traditional visual inspections.
Similar multispectral data is provided by earth observation satellites, such as Sentinel-2, however they are limited with respect to revisit time, spatial resolution and most importantly, their inability to see through clouds. Therefore, the use of multispectral drones can fill these operational gaps and provide more precise and timely data to the farmers.
However, to work simultaneously with satellite and drone data, analysts must have confidence in the precision and comparability of these two data sources (e.g., for NDVI). For example, the DJI P4 multispectral images have slightly different band sensitivities when compared with Sentinel-2, which may cause deviations in the index values. Another prominent problem is related to the field illumination, which depends on time of day and weather conditions. Even though the DJI P4 drone has a calibration sensor, supposed to compensate for the illuminating spectrum deviations, to the best of our knowledge, no public data set exists that demonstrates the tolerance of deviations between e.g., different drone footages or between DJI P4 and Sentinel-2. Moreover, Sentinel-2 implements atmospheric corrections that may contribute to such deviations as well.
Machine learning models can be utilized to extract valuable insights from multispectral data in precision agriculture applications. By leveraging the rich information captured across multiple spectral bands, machine learning algorithms can analyze and interpret the data to provide actionable recommendations for farmers and agronomists, such as highlighting areas with the most vegetation stress. Successful implementation of machine learning models for precision agriculture, based on multispectral data, requires high quality data sets, which are currently scarce. Therefore, collection of a high-quality, multispectral data set is a prerequisite to future machine learning experiments in the domain of precision farming.
For these reasons, our research team conducted multiple surveys, tracking the entire lifecycle of a sunflower field and gathering spectal data.
### Purpose
This dataset was developed as part of a research project, investigating the capabilities and application of drones and multispectral cameras for the agricultural domain.
The provided data can be used for the following scenarios:
1) Training models relying on multispectral datasources.
2) Improve existing algorithms in the computer vision domain.
## Time Period of Content
- Single Date/Time: Start Date 2023-04-25 to End Date 2023-09-04
## Data Quality Information
Composite images have been generated with DJI Terra, with 70% frontal and 60% side overlap.
There are instances where a survey has been completed in the span of 2 days due to adverse environment conditions.
Although there was an effort to have surveys execution in a constant time window (morning and afternoon), for some of the runs this is not the case.
The raw data is validated to be complete - representing the entirety of the observed field for every survey.
### Horizontal Coordinate System
- Geographic Coordinate System: EPSG:4326
- Angular Unit: Decimal degrees
- Datum: WGS 84
- Prime Meridian: Greenwich
- Domain: Raster
## Entity and Attribute Information
### Detailed Description
#### Entities
Data is organized into directories. Each directory corresponds to one survey and uses DD.MM.YYYY format.
Each survey directory contains 2 subdirectories : raw and results.
results directory is the output from the DJI Terra processing of the raw data, collected by the drone.
- Contents:
- raw
- Composite images, derived from a single drone sensor. Images follow result_<Blue, Green, etc.> nomenclature.
- .prj projection file for every composite image
- .tfw georeference file for every composite image
- results
- subdirectories for each executed flight, required to complete the survey.
- each subdirectory keeps the raw data for each sensing point on the drone's mission path
- one point is represented by one JPG image and 5 grayscale TIF images, corresponding to each sensor of the drone
All images are injected with geo-referencing data, timestamps, image quality, camera properties.
The datasets hold additional metadata in two files:
- field_shape.geojson - bounding box for the sunflower field
- crop_details.txt - information about the crop
#### Capture aperture
Drone surveys are executed with DJI Phantom 4 Multispectral drone. The drone uses the following sensors to capture data:
Sensors: Six 1/2.9” CMOS
Filters:
- Blue (B): 450 nm ± 16 nm
- Green (G): 560 nm ± 16 nm
- Red (R): 650 nm ± 16 nm
- Red edge (RE): 730 nm ± 16 nm
- Near-infrared (NIR): 840 nm ± 26 nm
Lenses:
- FOV (Field of View): 62.7°
- Focal Length: 5.74 mm
- Aperture: f/2.2
Software used for generating composite images: DJI Terra 3.6.8.
## Metadata Reference Information
- Metadata Contact:
- Name: Pavel Genevski
- Organization: SAP LABS Bulgaria
- Position: Research expert
- Email: pavel.genevski@URL
- Metadata Contact:
- Name: Radoslav Stefanov
- Organization: SAP LABS Bulgaria
- Position: Senior developer
- Email: radoslav.stefanov@URL
- Metadata Date: Date of creating this metadata (2023.11.08)
- Metadata Standard Name: FGDC Content Standard for Digital Geospatial Metadata
## Additional Information
- Keywords: agriculture, multispectral, crop, sunflower
- Access Constraints: CC BY 4.0
- Use Constraints: CC BY 4.0
| [
"# Dataset Metadata",
"## Identification Information\n\n- Title:Aerial surveys of a sunflower crop’s lifecycle from May to September 2023 \n- Originator: Sofia University - faculty of mathematics and informatics, SAP LABS Bulgaria \n- Publication Date: 2023.11.08",
"### Abstract\n\nEfficient food production is shaping up to be one of the new frontiers for new technologies and solutions. One such prominent domain is the remote sensing ecosystem, and more precicely, technologies such as multispectral and hyperspectral sensing equipment.\nThese devices are gradually moving from the academia environment to the industry world, and there decrease is cost allows for many new applications to emerge. \n\nMultispectral drones are advanced unmanned aerial vehicles (UAVs) equipped with cameras or sensors, capable of capturing imagery across multiple spectral bands. Unlike traditional RGB counterparts, they capture data not only within, but also beyond the visible spectrum, such as near-infrared (NIR). This data can provide valuable insights for various applications, including agriculture, environmental monitoring, land surveying, and more.\nOne of the main uses of multispectral drones in agriculture is related to the calculation of vegetation (NDVI, NDRE etc.) and other indices that inform the farmer about crop development, stress etc. The latter can also serve as indirect indicator of soil conditions and water distribution. This approach enables more accurate and detailed assessments compared to traditional visual inspections.\nSimilar multispectral data is provided by earth observation satellites, such as Sentinel-2, however they are limited with respect to revisit time, spatial resolution and most importantly, their inability to see through clouds. Therefore, the use of multispectral drones can fill these operational gaps and provide more precise and timely data to the farmers.\nHowever, to work simultaneously with satellite and drone data, analysts must have confidence in the precision and comparability of these two data sources (e.g., for NDVI). For example, the DJI P4 multispectral images have slightly different band sensitivities when compared with Sentinel-2, which may cause deviations in the index values. Another prominent problem is related to the field illumination, which depends on time of day and weather conditions. Even though the DJI P4 drone has a calibration sensor, supposed to compensate for the illuminating spectrum deviations, to the best of our knowledge, no public data set exists that demonstrates the tolerance of deviations between e.g., different drone footages or between DJI P4 and Sentinel-2. Moreover, Sentinel-2 implements atmospheric corrections that may contribute to such deviations as well.\nMachine learning models can be utilized to extract valuable insights from multispectral data in precision agriculture applications. By leveraging the rich information captured across multiple spectral bands, machine learning algorithms can analyze and interpret the data to provide actionable recommendations for farmers and agronomists, such as highlighting areas with the most vegetation stress. Successful implementation of machine learning models for precision agriculture, based on multispectral data, requires high quality data sets, which are currently scarce. Therefore, collection of a high-quality, multispectral data set is a prerequisite to future machine learning experiments in the domain of precision farming.\n\nFor these reasons, our research team conducted multiple surveys, tracking the entire lifecycle of a sunflower field and gathering spectal data.",
"### Purpose\n\nThis dataset was developed as part of a research project, investigating the capabilities and application of drones and multispectral cameras for the agricultural domain.\nThe provided data can be used for the following scenarios:\n1) Training models relying on multispectral datasources.\n2) Improve existing algorithms in the computer vision domain.",
"## Time Period of Content\n\n- Single Date/Time: Start Date 2023-04-25 to End Date 2023-09-04",
"## Data Quality Information\n\nComposite images have been generated with DJI Terra, with 70% frontal and 60% side overlap.\nThere are instances where a survey has been completed in the span of 2 days due to adverse environment conditions.\nAlthough there was an effort to have surveys execution in a constant time window (morning and afternoon), for some of the runs this is not the case.\nThe raw data is validated to be complete - representing the entirety of the observed field for every survey.",
"### Horizontal Coordinate System\n\n- Geographic Coordinate System: EPSG:4326\n - Angular Unit: Decimal degrees\n - Datum: WGS 84\n - Prime Meridian: Greenwich\n - Domain: Raster",
"## Entity and Attribute Information",
"### Detailed Description",
"#### Entities\nData is organized into directories. Each directory corresponds to one survey and uses DD.MM.YYYY format.\n\nEach survey directory contains 2 subdirectories : raw and results.\nresults directory is the output from the DJI Terra processing of the raw data, collected by the drone. \n\n- Contents:\n - raw\n - Composite images, derived from a single drone sensor. Images follow result_<Blue, Green, etc.> nomenclature.\n - .prj projection file for every composite image\n - .tfw georeference file for every composite image\n - results\n - subdirectories for each executed flight, required to complete the survey.\n - each subdirectory keeps the raw data for each sensing point on the drone's mission path\n - one point is represented by one JPG image and 5 grayscale TIF images, corresponding to each sensor of the drone\n\nAll images are injected with geo-referencing data, timestamps, image quality, camera properties.\n\nThe datasets hold additional metadata in two files:\n - field_shape.geojson - bounding box for the sunflower field\n - crop_details.txt - information about the crop",
"#### Capture aperture\n\nDrone surveys are executed with DJI Phantom 4 Multispectral drone. The drone uses the following sensors to capture data:\n\nSensors: Six 1/2.9” CMOS \n\nFilters: \n\n - Blue (B): 450 nm ± 16 nm\n - Green (G): 560 nm ± 16 nm\n - Red (R): 650 nm ± 16 nm\n - Red edge (RE): 730 nm ± 16 nm\n - Near-infrared (NIR): 840 nm ± 26 nm\n\nLenses: \n\n- FOV (Field of View): 62.7°\n- Focal Length: 5.74 mm\n- Aperture: f/2.2\n\nSoftware used for generating composite images: DJI Terra 3.6.8.",
"## Metadata Reference Information\n\n- Metadata Contact:\n - Name: Pavel Genevski\n - Organization: SAP LABS Bulgaria\n - Position: Research expert\n - Email: pavel.genevski@URL\n\n- Metadata Contact:\n - Name: Radoslav Stefanov\n - Organization: SAP LABS Bulgaria\n - Position: Senior developer\n - Email: radoslav.stefanov@URL\n\n- Metadata Date: Date of creating this metadata (2023.11.08)\n- Metadata Standard Name: FGDC Content Standard for Digital Geospatial Metadata",
"## Additional Information\n\n- Keywords: agriculture, multispectral, crop, sunflower\n- Access Constraints: CC BY 4.0\n- Use Constraints: CC BY 4.0"
] | [
"TAGS\n#region-us \n",
"# Dataset Metadata",
"## Identification Information\n\n- Title:Aerial surveys of a sunflower crop’s lifecycle from May to September 2023 \n- Originator: Sofia University - faculty of mathematics and informatics, SAP LABS Bulgaria \n- Publication Date: 2023.11.08",
"### Abstract\n\nEfficient food production is shaping up to be one of the new frontiers for new technologies and solutions. One such prominent domain is the remote sensing ecosystem, and more precicely, technologies such as multispectral and hyperspectral sensing equipment.\nThese devices are gradually moving from the academia environment to the industry world, and there decrease is cost allows for many new applications to emerge. \n\nMultispectral drones are advanced unmanned aerial vehicles (UAVs) equipped with cameras or sensors, capable of capturing imagery across multiple spectral bands. Unlike traditional RGB counterparts, they capture data not only within, but also beyond the visible spectrum, such as near-infrared (NIR). This data can provide valuable insights for various applications, including agriculture, environmental monitoring, land surveying, and more.\nOne of the main uses of multispectral drones in agriculture is related to the calculation of vegetation (NDVI, NDRE etc.) and other indices that inform the farmer about crop development, stress etc. The latter can also serve as indirect indicator of soil conditions and water distribution. This approach enables more accurate and detailed assessments compared to traditional visual inspections.\nSimilar multispectral data is provided by earth observation satellites, such as Sentinel-2, however they are limited with respect to revisit time, spatial resolution and most importantly, their inability to see through clouds. Therefore, the use of multispectral drones can fill these operational gaps and provide more precise and timely data to the farmers.\nHowever, to work simultaneously with satellite and drone data, analysts must have confidence in the precision and comparability of these two data sources (e.g., for NDVI). For example, the DJI P4 multispectral images have slightly different band sensitivities when compared with Sentinel-2, which may cause deviations in the index values. Another prominent problem is related to the field illumination, which depends on time of day and weather conditions. Even though the DJI P4 drone has a calibration sensor, supposed to compensate for the illuminating spectrum deviations, to the best of our knowledge, no public data set exists that demonstrates the tolerance of deviations between e.g., different drone footages or between DJI P4 and Sentinel-2. Moreover, Sentinel-2 implements atmospheric corrections that may contribute to such deviations as well.\nMachine learning models can be utilized to extract valuable insights from multispectral data in precision agriculture applications. By leveraging the rich information captured across multiple spectral bands, machine learning algorithms can analyze and interpret the data to provide actionable recommendations for farmers and agronomists, such as highlighting areas with the most vegetation stress. Successful implementation of machine learning models for precision agriculture, based on multispectral data, requires high quality data sets, which are currently scarce. Therefore, collection of a high-quality, multispectral data set is a prerequisite to future machine learning experiments in the domain of precision farming.\n\nFor these reasons, our research team conducted multiple surveys, tracking the entire lifecycle of a sunflower field and gathering spectal data.",
"### Purpose\n\nThis dataset was developed as part of a research project, investigating the capabilities and application of drones and multispectral cameras for the agricultural domain.\nThe provided data can be used for the following scenarios:\n1) Training models relying on multispectral datasources.\n2) Improve existing algorithms in the computer vision domain.",
"## Time Period of Content\n\n- Single Date/Time: Start Date 2023-04-25 to End Date 2023-09-04",
"## Data Quality Information\n\nComposite images have been generated with DJI Terra, with 70% frontal and 60% side overlap.\nThere are instances where a survey has been completed in the span of 2 days due to adverse environment conditions.\nAlthough there was an effort to have surveys execution in a constant time window (morning and afternoon), for some of the runs this is not the case.\nThe raw data is validated to be complete - representing the entirety of the observed field for every survey.",
"### Horizontal Coordinate System\n\n- Geographic Coordinate System: EPSG:4326\n - Angular Unit: Decimal degrees\n - Datum: WGS 84\n - Prime Meridian: Greenwich\n - Domain: Raster",
"## Entity and Attribute Information",
"### Detailed Description",
"#### Entities\nData is organized into directories. Each directory corresponds to one survey and uses DD.MM.YYYY format.\n\nEach survey directory contains 2 subdirectories : raw and results.\nresults directory is the output from the DJI Terra processing of the raw data, collected by the drone. \n\n- Contents:\n - raw\n - Composite images, derived from a single drone sensor. Images follow result_<Blue, Green, etc.> nomenclature.\n - .prj projection file for every composite image\n - .tfw georeference file for every composite image\n - results\n - subdirectories for each executed flight, required to complete the survey.\n - each subdirectory keeps the raw data for each sensing point on the drone's mission path\n - one point is represented by one JPG image and 5 grayscale TIF images, corresponding to each sensor of the drone\n\nAll images are injected with geo-referencing data, timestamps, image quality, camera properties.\n\nThe datasets hold additional metadata in two files:\n - field_shape.geojson - bounding box for the sunflower field\n - crop_details.txt - information about the crop",
"#### Capture aperture\n\nDrone surveys are executed with DJI Phantom 4 Multispectral drone. The drone uses the following sensors to capture data:\n\nSensors: Six 1/2.9” CMOS \n\nFilters: \n\n - Blue (B): 450 nm ± 16 nm\n - Green (G): 560 nm ± 16 nm\n - Red (R): 650 nm ± 16 nm\n - Red edge (RE): 730 nm ± 16 nm\n - Near-infrared (NIR): 840 nm ± 26 nm\n\nLenses: \n\n- FOV (Field of View): 62.7°\n- Focal Length: 5.74 mm\n- Aperture: f/2.2\n\nSoftware used for generating composite images: DJI Terra 3.6.8.",
"## Metadata Reference Information\n\n- Metadata Contact:\n - Name: Pavel Genevski\n - Organization: SAP LABS Bulgaria\n - Position: Research expert\n - Email: pavel.genevski@URL\n\n- Metadata Contact:\n - Name: Radoslav Stefanov\n - Organization: SAP LABS Bulgaria\n - Position: Senior developer\n - Email: radoslav.stefanov@URL\n\n- Metadata Date: Date of creating this metadata (2023.11.08)\n- Metadata Standard Name: FGDC Content Standard for Digital Geospatial Metadata",
"## Additional Information\n\n- Keywords: agriculture, multispectral, crop, sunflower\n- Access Constraints: CC BY 4.0\n- Use Constraints: CC BY 4.0"
] |
fb2fabda39179f07a96066b05e6d342ac0429eb8 |
# Dataset Card for Evaluation run of SF-Foundation/Ein-72B-v0.11
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SF-Foundation/Ein-72B-v0.11](https://huggingface.co/SF-Foundation/Ein-72B-v0.11) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.11",
"harness_winogrande_5",
split="train")
```
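The same call pattern works for any other configuration of this dataset. Below is a minimal sketch (assuming only the `datasets` library; the configuration name `"results"` and the `"latest"` split are the ones listed in this card's metadata) that enumerates the available configurations and loads the aggregated results directly:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.11"

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(f"{len(configs)} configurations available, e.g. {configs[:3]}")

# Each configuration also exposes a "latest" split that points at the newest run.
aggregated = load_dataset(repo, "results", split="latest")
print(aggregated.column_names)
```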
## Latest results
These are the [latest results from run 2024-02-11T13:40:58.813057](https://huggingface.co/datasets/open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.11/blob/main/results_2024-02-11T13-40-58.813057.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.772373168297044,
"acc_stderr": 0.028022585208284104,
"acc_norm": 0.7739457676486081,
"acc_norm_stderr": 0.02857928542974863,
"mc1": 0.6634026927784578,
"mc1_stderr": 0.016542412809494887,
"mc2": 0.790182015835219,
"mc2_stderr": 0.013777445073321324
},
"harness|arc:challenge|25": {
"acc": 0.7474402730375427,
"acc_stderr": 0.012696728980207704,
"acc_norm": 0.7679180887372014,
"acc_norm_stderr": 0.012336718284948856
},
"harness|hellaswag|10": {
"acc": 0.7343158733320055,
"acc_stderr": 0.004407941058874964,
"acc_norm": 0.890161322445728,
"acc_norm_stderr": 0.003120495238827559
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474928,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8377358490566038,
"acc_stderr": 0.02269148287203535,
"acc_norm": 0.8377358490566038,
"acc_norm_stderr": 0.02269148287203535
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9375,
"acc_stderr": 0.02024219611347799,
"acc_norm": 0.9375,
"acc_norm_stderr": 0.02024219611347799
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818317,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818317
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8,
"acc_stderr": 0.026148818018424506,
"acc_norm": 0.8,
"acc_norm_stderr": 0.026148818018424506
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7793103448275862,
"acc_stderr": 0.0345593020192481,
"acc_norm": 0.7793103448275862,
"acc_norm_stderr": 0.0345593020192481
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6825396825396826,
"acc_stderr": 0.023973861998992072,
"acc_norm": 0.6825396825396826,
"acc_norm_stderr": 0.023973861998992072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8870967741935484,
"acc_stderr": 0.01800360332586361,
"acc_norm": 0.8870967741935484,
"acc_norm_stderr": 0.01800360332586361
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6600985221674877,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.6600985221674877,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9393939393939394,
"acc_stderr": 0.016999994927421592,
"acc_norm": 0.9393939393939394,
"acc_norm_stderr": 0.016999994927421592
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9844559585492227,
"acc_stderr": 0.008927492715084315,
"acc_norm": 0.9844559585492227,
"acc_norm_stderr": 0.008927492715084315
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8051282051282052,
"acc_stderr": 0.020083167595181393,
"acc_norm": 0.8051282051282052,
"acc_norm_stderr": 0.020083167595181393
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.030384169232350818,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.030384169232350818
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398904,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5629139072847682,
"acc_stderr": 0.040500357222306355,
"acc_norm": 0.5629139072847682,
"acc_norm_stderr": 0.040500357222306355
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9357798165137615,
"acc_stderr": 0.0105104947132014,
"acc_norm": 0.9357798165137615,
"acc_norm_stderr": 0.0105104947132014
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.0316746870682898,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.0316746870682898
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9019607843137255,
"acc_stderr": 0.020871118455552104,
"acc_norm": 0.9019607843137255,
"acc_norm_stderr": 0.020871118455552104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.018889750550956715,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.018889750550956715
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540616,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540616
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243630999,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243630999
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8343558282208589,
"acc_stderr": 0.029208296231259104,
"acc_norm": 0.8343558282208589,
"acc_norm_stderr": 0.029208296231259104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.03393295729761011,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.03393295729761011
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446914,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446914
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977725,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977725
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9157088122605364,
"acc_stderr": 0.009934966499513791,
"acc_norm": 0.9157088122605364,
"acc_norm_stderr": 0.009934966499513791
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.838150289017341,
"acc_stderr": 0.019829299214925416,
"acc_norm": 0.838150289017341,
"acc_norm_stderr": 0.019829299214925416
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6994413407821229,
"acc_stderr": 0.01533456680625116,
"acc_norm": 0.6994413407821229,
"acc_norm_stderr": 0.01533456680625116
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8464052287581699,
"acc_stderr": 0.02064559791041878,
"acc_norm": 0.8464052287581699,
"acc_norm_stderr": 0.02064559791041878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8327974276527331,
"acc_stderr": 0.021193872528034962,
"acc_norm": 0.8327974276527331,
"acc_norm_stderr": 0.021193872528034962
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8641975308641975,
"acc_stderr": 0.019061588181505405,
"acc_norm": 0.8641975308641975,
"acc_norm_stderr": 0.019061588181505405
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6631205673758865,
"acc_stderr": 0.02819553487396673,
"acc_norm": 0.6631205673758865,
"acc_norm_stderr": 0.02819553487396673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6134289439374185,
"acc_stderr": 0.012437288868088725,
"acc_norm": 0.6134289439374185,
"acc_norm_stderr": 0.012437288868088725
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02236867256288675,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02236867256288675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.01553374508338279,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.01553374508338279
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8326530612244898,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.8326530612244898,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9054726368159204,
"acc_stderr": 0.020687186951534094,
"acc_norm": 0.9054726368159204,
"acc_norm_stderr": 0.020687186951534094
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.0256432399976243,
"acc_norm": 0.93,
"acc_norm_stderr": 0.0256432399976243
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015578,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015578
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6634026927784578,
"mc1_stderr": 0.016542412809494887,
"mc2": 0.790182015835219,
"mc2_stderr": 0.013777445073321324
},
"harness|winogrande|5": {
"acc": 0.840568271507498,
"acc_stderr": 0.010288617479454764
},
"harness|gsm8k|5": {
"acc": 0.7877179681576952,
"acc_stderr": 0.011263783355400313
}
}
```
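If you prefer to work with the raw numbers above rather than the parquet details, the per-task scores can be pulled straight out of the parsed JSON. The following is a small sketch and not part of the original card: it assumes `results` is the dictionary printed above (for example obtained with `json.load` from the linked results file, which is assumed to nest these scores under a top-level `"results"` key):

```python
import json

def mmlu_average(results: dict) -> float:
    """Mean accuracy over the hendrycksTest (MMLU) subtasks of a results dict."""
    accs = [
        scores["acc"]
        for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

# Hypothetical usage, assuming the linked JSON file was downloaded locally:
# with open("results_2024-02-11T13-40-58.813057.json") as f:
#     results = json.load(f)["results"]  # assumption: scores sit under a "results" key
# print(f"MMLU mean accuracy: {mmlu_average(results):.4f}")
```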
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.11 | [
"region:us"
] | 2024-02-11T13:43:06+00:00 | {"pretty_name": "Evaluation run of SF-Foundation/Ein-72B-v0.11", "dataset_summary": "Dataset automatically created during the evaluation run of model [SF-Foundation/Ein-72B-v0.11](https://huggingface.co/SF-Foundation/Ein-72B-v0.11) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.11\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T13:40:58.813057](https://huggingface.co/datasets/open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.11/blob/main/results_2024-02-11T13-40-58.813057.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.772373168297044,\n \"acc_stderr\": 0.028022585208284104,\n \"acc_norm\": 0.7739457676486081,\n \"acc_norm_stderr\": 0.02857928542974863,\n \"mc1\": 0.6634026927784578,\n \"mc1_stderr\": 0.016542412809494887,\n \"mc2\": 0.790182015835219,\n \"mc2_stderr\": 0.013777445073321324\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7474402730375427,\n \"acc_stderr\": 0.012696728980207704,\n \"acc_norm\": 0.7679180887372014,\n \"acc_norm_stderr\": 0.012336718284948856\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7343158733320055,\n \"acc_stderr\": 0.004407941058874964,\n \"acc_norm\": 0.890161322445728,\n \"acc_norm_stderr\": 0.003120495238827559\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474928,\n \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474928\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8377358490566038,\n \"acc_stderr\": 0.02269148287203535,\n \"acc_norm\": 0.8377358490566038,\n \"acc_norm_stderr\": 0.02269148287203535\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9375,\n \"acc_stderr\": 0.02024219611347799,\n \"acc_norm\": 0.9375,\n \"acc_norm_stderr\": 0.02024219611347799\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n 
\"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.03295304696818317,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.03295304696818317\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036622,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.026148818018424506,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.026148818018424506\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7793103448275862,\n \"acc_stderr\": 0.0345593020192481,\n \"acc_norm\": 0.7793103448275862,\n \"acc_norm_stderr\": 0.0345593020192481\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6825396825396826,\n \"acc_stderr\": 0.023973861998992072,\n \"acc_norm\": 0.6825396825396826,\n \"acc_norm_stderr\": 0.023973861998992072\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8870967741935484,\n \"acc_stderr\": 0.01800360332586361,\n \"acc_norm\": 0.8870967741935484,\n \"acc_norm_stderr\": 0.01800360332586361\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6600985221674877,\n \"acc_stderr\": 0.033327690684107895,\n \"acc_norm\": 0.6600985221674877,\n \"acc_norm_stderr\": 0.033327690684107895\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9393939393939394,\n \"acc_stderr\": 0.016999994927421592,\n \"acc_norm\": 0.9393939393939394,\n \"acc_norm_stderr\": 0.016999994927421592\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9844559585492227,\n \"acc_stderr\": 0.008927492715084315,\n \"acc_norm\": 0.9844559585492227,\n \"acc_norm_stderr\": 0.008927492715084315\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8051282051282052,\n 
\"acc_stderr\": 0.020083167595181393,\n \"acc_norm\": 0.8051282051282052,\n \"acc_norm_stderr\": 0.020083167595181393\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.030384169232350818,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.030384169232350818\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5629139072847682,\n \"acc_stderr\": 0.040500357222306355,\n \"acc_norm\": 0.5629139072847682,\n \"acc_norm_stderr\": 0.040500357222306355\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9357798165137615,\n \"acc_stderr\": 0.0105104947132014,\n \"acc_norm\": 0.9357798165137615,\n \"acc_norm_stderr\": 0.0105104947132014\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.0316746870682898,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.0316746870682898\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9019607843137255,\n \"acc_stderr\": 0.020871118455552104,\n \"acc_norm\": 0.9019607843137255,\n \"acc_norm_stderr\": 0.020871118455552104\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.018889750550956715,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.018889750550956715\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540616,\n \"acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540616\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.03434300243630999,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.03434300243630999\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761011,\n \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761011\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n \"acc_stderr\": 0.015006312806446914,\n \"acc_norm\": 0.9444444444444444,\n \"acc_norm_stderr\": 0.015006312806446914\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977725,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977725\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9157088122605364,\n \"acc_stderr\": 0.009934966499513791,\n \"acc_norm\": 
0.9157088122605364,\n \"acc_norm_stderr\": 0.009934966499513791\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.838150289017341,\n \"acc_stderr\": 0.019829299214925416,\n \"acc_norm\": 0.838150289017341,\n \"acc_norm_stderr\": 0.019829299214925416\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6994413407821229,\n \"acc_stderr\": 0.01533456680625116,\n \"acc_norm\": 0.6994413407821229,\n \"acc_norm_stderr\": 0.01533456680625116\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8464052287581699,\n \"acc_stderr\": 0.02064559791041878,\n \"acc_norm\": 0.8464052287581699,\n \"acc_norm_stderr\": 0.02064559791041878\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8327974276527331,\n \"acc_stderr\": 0.021193872528034962,\n \"acc_norm\": 0.8327974276527331,\n \"acc_norm_stderr\": 0.021193872528034962\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8641975308641975,\n \"acc_stderr\": 0.019061588181505405,\n \"acc_norm\": 0.8641975308641975,\n \"acc_norm_stderr\": 0.019061588181505405\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6631205673758865,\n \"acc_stderr\": 0.02819553487396673,\n \"acc_norm\": 0.6631205673758865,\n \"acc_norm_stderr\": 0.02819553487396673\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6134289439374185,\n \"acc_stderr\": 0.012437288868088725,\n \"acc_norm\": 0.6134289439374185,\n \"acc_norm_stderr\": 0.012437288868088725\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02236867256288675,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02236867256288675\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8202614379084967,\n \"acc_stderr\": 0.01553374508338279,\n \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.01553374508338279\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.7545454545454545,\n \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.02389714476891452,\n \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.02389714476891452\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9054726368159204,\n \"acc_stderr\": 0.020687186951534094,\n \"acc_norm\": 0.9054726368159204,\n \"acc_norm_stderr\": 0.020687186951534094\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015578,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015578\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6634026927784578,\n \"mc1_stderr\": 0.016542412809494887,\n \"mc2\": 0.790182015835219,\n \"mc2_stderr\": 0.013777445073321324\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.840568271507498,\n \"acc_stderr\": 0.010288617479454764\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7877179681576952,\n \"acc_stderr\": 0.011263783355400313\n }\n}\n```", "repo_url": "https://huggingface.co/SF-Foundation/Ein-72B-v0.11", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|arc:challenge|25_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|gsm8k|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hellaswag|10_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T13-40-58.813057.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T13-40-58.813057.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T13-40-58.813057.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T13-40-58.813057.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T13-40-58.813057.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T13-40-58.813057.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["**/details_harness|winogrande|5_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T13-40-58.813057.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_11T13_40_58.813057", "path": ["results_2024-02-11T13-40-58.813057.parquet"]}, {"split": "latest", "path": 
["results_2024-02-11T13-40-58.813057.parquet"]}]}]} | 2024-02-11T13:43:31+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of SF-Foundation/Ein-72B-v0.11
Dataset automatically created during the evaluation run of model SF-Foundation/Ein-72B-v0.11 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-11T13:40:58.813057 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of SF-Foundation/Ein-72B-v0.11\n\n\n\nDataset automatically created during the evaluation run of model SF-Foundation/Ein-72B-v0.11 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T13:40:58.813057(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of SF-Foundation/Ein-72B-v0.11\n\n\n\nDataset automatically created during the evaluation run of model SF-Foundation/Ein-72B-v0.11 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T13:40:58.813057(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
84106eb8a1b3c3c97997d642f0450ef45ae5dbcf | From https://huggingface.co/datasets/mozilla-foundation/common_voice_16_1 but restricted to the Czech train audio files, converted to WAV and denoised using UVR5 | VitekDev/common_voice_16_1_cs_train_wav | [
"language:cs",
"license:cc0-1.0",
"region:us"
] | 2024-02-11T13:44:34+00:00 | {"language": ["cs"], "license": "cc0-1.0"} | 2024-02-11T13:46:30+00:00 | [] | [
"cs"
] | TAGS
#language-Czech #license-cc0-1.0 #region-us
| From URL but restricted to the Czech train audio files, converted to WAV and denoised using UVR5 | [] | [
"TAGS\n#language-Czech #license-cc0-1.0 #region-us \n"
] |
91a3b01245856aef220618118b1282e299649e7e |
# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ArianAskari/SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta](https://huggingface.co/ArianAskari/SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
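# Any of the 63 task configurations can be loaded this way; here we pick the
# 5-shot Winogrande details, with split="train" pointing at the latest run.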
data = load_dataset("open-llm-leaderboard/details_ArianAskari__SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta",
"harness_winogrande_5",
split="train")
```
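
The per-task configurations all follow the same pattern. As a minimal sketch (using only the configuration and split names listed in this card's metadata), you can also load the aggregated "results" configuration and take its "latest" split to get the most recent run:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_ArianAskari__SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta",
    "results",
    split="latest",
)
```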
## Latest results
These are the [latest results from run 2024-02-11T14:20:18.392173](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta/blob/main/results_2024-02-11T14-20-18.392173.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5989068889556914,
"acc_stderr": 0.03306588865476634,
"acc_norm": 0.6081578643232973,
"acc_norm_stderr": 0.03380748896101241,
"mc1": 0.3818849449204406,
"mc1_stderr": 0.017008101939163495,
"mc2": 0.5376745022515824,
"mc2_stderr": 0.01602462184426783
},
"harness|arc:challenge|25": {
"acc": 0.5631399317406144,
"acc_stderr": 0.014494421584256522,
"acc_norm": 0.5972696245733788,
"acc_norm_stderr": 0.01433223630679014
},
"harness|hellaswag|10": {
"acc": 0.6338378809002191,
"acc_stderr": 0.004807699539973415,
"acc_norm": 0.817167894841665,
"acc_norm_stderr": 0.003857388613533091
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6377358490566037,
"acc_stderr": 0.0295822451283843,
"acc_norm": 0.6377358490566037,
"acc_norm_stderr": 0.0295822451283843
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.03260038511835772,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.03260038511835772
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067884,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067884
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239966,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239966
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878934,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878934
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.016970289090458033,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.016970289090458033
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.03252113489929188,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.03252113489929188
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7867177522349936,
"acc_stderr": 0.014648172749593515,
"acc_norm": 0.7867177522349936,
"acc_norm_stderr": 0.014648172749593515
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.02494679222527231,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.02494679222527231
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32625698324022345,
"acc_stderr": 0.015680441518889178,
"acc_norm": 0.32625698324022345,
"acc_norm_stderr": 0.015680441518889178
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.026041766202717156,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.026041766202717156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4178617992177314,
"acc_stderr": 0.012596744108998562,
"acc_norm": 0.4178617992177314,
"acc_norm_stderr": 0.012596744108998562
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.019706875804085637,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.019706875804085637
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.03055531675557364,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.03055531675557364
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3818849449204406,
"mc1_stderr": 0.017008101939163495,
"mc2": 0.5376745022515824,
"mc2_stderr": 0.01602462184426783
},
"harness|winogrande|5": {
"acc": 0.7466456195737964,
"acc_stderr": 0.012223754434233623
},
"harness|gsm8k|5": {
"acc": 0.12282031842304776,
"acc_stderr": 0.009041108602874664
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ArianAskari__SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta | [
"region:us"
] | 2024-02-11T14:22:39+00:00 | {"pretty_name": "Evaluation run of ArianAskari/SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta", "dataset_summary": "Dataset automatically created during the evaluation run of model [ArianAskari/SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta](https://huggingface.co/ArianAskari/SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ArianAskari__SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T14:20:18.392173](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta/blob/main/results_2024-02-11T14-20-18.392173.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5989068889556914,\n \"acc_stderr\": 0.03306588865476634,\n \"acc_norm\": 0.6081578643232973,\n \"acc_norm_stderr\": 0.03380748896101241,\n \"mc1\": 0.3818849449204406,\n \"mc1_stderr\": 0.017008101939163495,\n \"mc2\": 0.5376745022515824,\n \"mc2_stderr\": 0.01602462184426783\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5631399317406144,\n \"acc_stderr\": 0.014494421584256522,\n \"acc_norm\": 0.5972696245733788,\n \"acc_norm_stderr\": 0.01433223630679014\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6338378809002191,\n \"acc_stderr\": 0.004807699539973415,\n \"acc_norm\": 0.817167894841665,\n \"acc_norm_stderr\": 0.003857388613533091\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.0295822451283843,\n \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.0295822451283843\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 
0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835772,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835772\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067884,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067884\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n \"acc_stderr\": 0.024685979286239966,\n \"acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.024685979286239966\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 
0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878934,\n \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878934\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8055045871559633,\n \"acc_stderr\": 0.016970289090458033,\n \"acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.016970289090458033\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.028756799629658342,\n \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.028756799629658342\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n \"acc_stderr\": 0.03252113489929188,\n \"acc_norm\": 0.6233183856502242,\n \"acc_norm_stderr\": 0.03252113489929188\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7107438016528925,\n \"acc_stderr\": 0.041391127276354626,\n \"acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.041391127276354626\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 
0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7867177522349936,\n \"acc_stderr\": 0.014648172749593515,\n \"acc_norm\": 0.7867177522349936,\n \"acc_norm_stderr\": 0.014648172749593515\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.02494679222527231,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.02494679222527231\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32625698324022345,\n \"acc_stderr\": 0.015680441518889178,\n \"acc_norm\": 0.32625698324022345,\n \"acc_norm_stderr\": 0.015680441518889178\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.026041766202717156,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.026041766202717156\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4178617992177314,\n \"acc_stderr\": 0.012596744108998562,\n \"acc_norm\": 0.4178617992177314,\n \"acc_norm_stderr\": 0.012596744108998562\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6127450980392157,\n \"acc_stderr\": 0.019706875804085637,\n \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.019706875804085637\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.03055531675557364,\n \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.03055531675557364\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3818849449204406,\n \"mc1_stderr\": 0.017008101939163495,\n \"mc2\": 0.5376745022515824,\n \"mc2_stderr\": 0.01602462184426783\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7466456195737964,\n \"acc_stderr\": 0.012223754434233623\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.12282031842304776,\n \"acc_stderr\": 0.009041108602874664\n }\n}\n```", "repo_url": "https://huggingface.co/ArianAskari/SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|arc:challenge|25_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|gsm8k|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hellaswag|10_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-20-18.392173.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-20-18.392173.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-20-18.392173.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T14-20-18.392173.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-20-18.392173.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["**/details_harness|winogrande|5_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-11T14-20-18.392173.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_11T14_20_18.392173", "path": ["results_2024-02-11T14-20-18.392173.parquet"]}, {"split": "latest", "path": ["results_2024-02-11T14-20-18.392173.parquet"]}]}]} | 2024-02-11T14:23:00+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta
Dataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-11T14:20:18.392173 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta\n\n\n\nDataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T14:20:18.392173(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ArianAskari/SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta\n\n\n\nDataset automatically created during the evaluation run of model ArianAskari/SOLID-SFT-WoDPO-MixQV2-Zephyr-7b-beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T14:20:18.392173(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
963c139f3e12403b7055febb6415fc79d10238fd |
# Dataset Card for Evaluation run of ArianAskari/SOLID_SFT-WoDPO-WoMixQ
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ArianAskari/SOLID_SFT-WoDPO-WoMixQ](https://huggingface.co/ArianAskari/SOLID_SFT-WoDPO-WoMixQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ArianAskari__SOLID_SFT-WoDPO-WoMixQ",
"harness_winogrande_5",
split="train")
```
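
As a quick sanity check (not part of the original card), the object returned above is a standard `datasets.Dataset`, so it can be inspected directly. The exact per-example columns depend on the harness version, so treat the printed fields below as exploratory rather than a documented schema:

```python
from datasets import load_dataset

# A minimal sketch: load the per-example details for the 5-shot Winogrande run
# and inspect them. The per-example column layout is not documented in this card,
# so the prints below are exploratory.
data = load_dataset(
    "open-llm-leaderboard/details_ArianAskari__SOLID_SFT-WoDPO-WoMixQ",
    "harness_winogrande_5",
    split="train",
)

print(len(data))          # number of evaluated examples in this run
print(data.column_names)  # per-example fields logged by the evaluation harness
print(data[0])            # first record
```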
## Latest results
These are the [latest results from run 2024-02-11T14:26:56.210922](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__SOLID_SFT-WoDPO-WoMixQ/blob/main/results_2024-02-11T14-26-56.210922.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5950015763685916,
"acc_stderr": 0.033165134676359634,
"acc_norm": 0.6045796991893356,
"acc_norm_stderr": 0.03392668551306121,
"mc1": 0.397796817625459,
"mc1_stderr": 0.017133934248559635,
"mc2": 0.5524968462205945,
"mc2_stderr": 0.01602039404250384
},
"harness|arc:challenge|25": {
"acc": 0.5614334470989761,
"acc_stderr": 0.014500682618212864,
"acc_norm": 0.5964163822525598,
"acc_norm_stderr": 0.014337158914268448
},
"harness|hellaswag|10": {
"acc": 0.6353316072495518,
"acc_stderr": 0.004803533333364225,
"acc_norm": 0.8168691495717985,
"acc_norm_stderr": 0.003859833044230901
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365242,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365242
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.03692820767264866,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.03692820767264866
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.032662042990646775,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.032662042990646775
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997695,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997695
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164535,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164535
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5974358974358974,
"acc_stderr": 0.024864995159767745,
"acc_norm": 0.5974358974358974,
"acc_norm_stderr": 0.024864995159767745
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.017324352325016015,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.017324352325016015
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375798,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375798
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.045723723587374296,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.045723723587374296
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281386,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281386
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.014927447101937148,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.014927447101937148
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242826,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242826
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187303,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187303
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.026643278474508758,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.026643278474508758
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291484,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291484
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42633637548891784,
"acc_stderr": 0.012630884771599698,
"acc_norm": 0.42633637548891784,
"acc_norm_stderr": 0.012630884771599698
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824866,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824866
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.0196758081352815,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.0196758081352815
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726492,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.397796817625459,
"mc1_stderr": 0.017133934248559635,
"mc2": 0.5524968462205945,
"mc2_stderr": 0.01602039404250384
},
"harness|winogrande|5": {
"acc": 0.7466456195737964,
"acc_stderr": 0.012223754434233621
},
"harness|gsm8k|5": {
"acc": 0.09476876421531463,
"acc_stderr": 0.008067791560015442
}
}
```
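
The aggregated numbers above can also be retrieved programmatically through the "results" configuration mentioned earlier. A minimal sketch, assuming the parquet export mirrors the JSON shown above (the exact flattened column names are not documented in this card, so the code only inspects them):

```python
from datasets import load_dataset

# Load the aggregated-results configuration; per the dataset configs, the "latest"
# split points at the most recent run (here 2024-02-11T14:26:56.210922).
results = load_dataset(
    "open-llm-leaderboard/details_ArianAskari__SOLID_SFT-WoDPO-WoMixQ",
    "results",
    split="latest",
)

# The column layout comes from flattening the results JSON above, so inspect it
# rather than hard-coding field names.
print(results.column_names)
print(results[0])
```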
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ArianAskari__SOLID_SFT-WoDPO-WoMixQ | [
"region:us"
] | 2024-02-11T14:29:19+00:00 | {"pretty_name": "Evaluation run of ArianAskari/SOLID_SFT-WoDPO-WoMixQ", "dataset_summary": "Dataset automatically created during the evaluation run of model [ArianAskari/SOLID_SFT-WoDPO-WoMixQ](https://huggingface.co/ArianAskari/SOLID_SFT-WoDPO-WoMixQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ArianAskari__SOLID_SFT-WoDPO-WoMixQ\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T14:26:56.210922](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__SOLID_SFT-WoDPO-WoMixQ/blob/main/results_2024-02-11T14-26-56.210922.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5950015763685916,\n \"acc_stderr\": 0.033165134676359634,\n \"acc_norm\": 0.6045796991893356,\n \"acc_norm_stderr\": 0.03392668551306121,\n \"mc1\": 0.397796817625459,\n \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.5524968462205945,\n \"mc2_stderr\": 0.01602039404250384\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5614334470989761,\n \"acc_stderr\": 0.014500682618212864,\n \"acc_norm\": 0.5964163822525598,\n \"acc_norm_stderr\": 0.014337158914268448\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6353316072495518,\n \"acc_stderr\": 0.004803533333364225,\n \"acc_norm\": 0.8168691495717985,\n \"acc_norm_stderr\": 0.003859833044230901\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365242,\n \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365242\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n 
\"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.03692820767264866,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.03692820767264866\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.032662042990646775,\n \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.032662042990646775\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997695,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997695\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164535,\n \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164535\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5974358974358974,\n \"acc_stderr\": 0.024864995159767745,\n \"acc_norm\": 0.5974358974358974,\n \"acc_norm_stderr\": 0.024864995159767745\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7944954128440367,\n \"acc_stderr\": 0.017324352325016015,\n \"acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.017324352325016015\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375798,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375798\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\": 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n \"acc_stderr\": 0.045723723587374296,\n \"acc_norm\": 0.36607142857142855,\n \"acc_norm_stderr\": 0.045723723587374296\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281386,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281386\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7752234993614304,\n \"acc_stderr\": 0.014927447101937148,\n \"acc_norm\": 0.7752234993614304,\n \"acc_norm_stderr\": 0.014927447101937148\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242826,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242826\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n \"acc_stderr\": 0.015748421208187303,\n \"acc_norm\": 0.3318435754189944,\n \"acc_norm_stderr\": 0.015748421208187303\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508758,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508758\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.02651759772446501,\n \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.02651759772446501\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291484,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291484\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42633637548891784,\n \"acc_stderr\": 0.012630884771599698,\n \"acc_norm\": 0.42633637548891784,\n \"acc_norm_stderr\": 0.012630884771599698\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824866,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824866\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6160130718954249,\n \"acc_stderr\": 0.0196758081352815,\n \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.0196758081352815\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.397796817625459,\n \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.5524968462205945,\n \"mc2_stderr\": 0.01602039404250384\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7466456195737964,\n \"acc_stderr\": 0.012223754434233621\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09476876421531463,\n \"acc_stderr\": 
0.008067791560015442\n }\n}\n```", "repo_url": "https://huggingface.co/ArianAskari/SOLID_SFT-WoDPO-WoMixQ", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|arc:challenge|25_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|gsm8k|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hellaswag|10_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-26-56.210922.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-26-56.210922.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-26-56.210922.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T14-26-56.210922.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-26-56.210922.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T14_26_56.210922", "path": ["**/details_harness|winogrande|5_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T14-26-56.210922.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_11T14_26_56.210922", "path": ["results_2024-02-11T14-26-56.210922.parquet"]}, {"split": "latest", "path": ["results_2024-02-11T14-26-56.210922.parquet"]}]}]} | 2024-02-11T14:29:40+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ArianAskari/SOLID_SFT-WoDPO-WoMixQ
Dataset automatically created during the evaluation run of model ArianAskari/SOLID_SFT-WoDPO-WoMixQ on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
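A minimal sketch of that load call is shown below; it assumes the details repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming and exposes the `harness_winogrande_5` configuration listed among this card's files:

```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's details_<org>__<model> pattern
data = load_dataset("open-llm-leaderboard/details_ArianAskari__SOLID_SFT-WoDPO-WoMixQ",
                    "harness_winogrande_5",
                    split="train")
```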
## Latest results
These are the latest results from run 2024-02-11T14:26:56.210922 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ArianAskari/SOLID_SFT-WoDPO-WoMixQ\n\n\n\nDataset automatically created during the evaluation run of model ArianAskari/SOLID_SFT-WoDPO-WoMixQ on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T14:26:56.210922(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ArianAskari/SOLID_SFT-WoDPO-WoMixQ\n\n\n\nDataset automatically created during the evaluation run of model ArianAskari/SOLID_SFT-WoDPO-WoMixQ on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T14:26:56.210922(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
eb990a706ddc97fc48bc2ac6606841189ac3263f |
# Dataset Card for Evaluation run of fhai50032/SamChat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fhai50032/SamChat](https://huggingface.co/fhai50032/SamChat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fhai50032__SamChat",
"harness_winogrande_5",
split="train")
```
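If you want a single task instead of the whole run, each task is also exposed as its own configuration; assuming the standard `harness_<task>_<n_shot>` config naming used by these details repositories, the GSM8K details can be loaded from the "latest" split like this:

```python
from datasets import load_dataset

# Assumed per-task config name, following the harness_<task>_<n_shot> convention
gsm8k_details = load_dataset("open-llm-leaderboard/details_fhai50032__SamChat",
                             "harness_gsm8k_5",
                             split="latest")
print(gsm8k_details)
```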
## Latest results
These are the [latest results from run 2024-02-11T14:43:14.190988](https://huggingface.co/datasets/open-llm-leaderboard/details_fhai50032__SamChat/blob/main/results_2024-02-11T14-43-14.190988.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5967272209929594,
"acc_stderr": 0.03347226424165155,
"acc_norm": 0.6019495737279535,
"acc_norm_stderr": 0.034142369952769536,
"mc1": 0.3880048959608323,
"mc1_stderr": 0.01705876150134797,
"mc2": 0.5289794791431763,
"mc2_stderr": 0.015597946760218941
},
"harness|arc:challenge|25": {
"acc": 0.575938566552901,
"acc_stderr": 0.014441889627464398,
"acc_norm": 0.6203071672354948,
"acc_norm_stderr": 0.014182119866974872
},
"harness|hellaswag|10": {
"acc": 0.6229834694284008,
"acc_stderr": 0.004836486437527259,
"acc_norm": 0.8194582752439753,
"acc_norm_stderr": 0.003838519335886879
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.049020713000019756,
"acc_norm": 0.39,
"acc_norm_stderr": 0.049020713000019756
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.036812296333943194,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.036812296333943194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.04966570903978529,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.04966570903978529
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.032685726586674915,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.032685726586674915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.024594975128920945,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.024594975128920945
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.025189006660212385,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.025189006660212385
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217483,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217483
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6230769230769231,
"acc_stderr": 0.024570975364225995,
"acc_norm": 0.6230769230769231,
"acc_norm_stderr": 0.024570975364225995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.031204691225150016,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.031204691225150016
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7834862385321101,
"acc_stderr": 0.017658710594443135,
"acc_norm": 0.7834862385321101,
"acc_norm_stderr": 0.017658710594443135
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653063,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653063
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598025,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598025
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724145,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724145
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652247,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652247
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7650063856960408,
"acc_stderr": 0.015162024152278434,
"acc_norm": 0.7650063856960408,
"acc_norm_stderr": 0.015162024152278434
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.0261521986197268,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.0261521986197268
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3340782122905028,
"acc_stderr": 0.01577491142238163,
"acc_norm": 0.3340782122905028,
"acc_norm_stderr": 0.01577491142238163
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.026857294663281406,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.026857294663281406
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.02698147804364805,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.02698147804364805
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824089,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824089
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.029427994039419994,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.029427994039419994
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39048239895697523,
"acc_stderr": 0.012460135913945073,
"acc_norm": 0.39048239895697523,
"acc_norm_stderr": 0.012460135913945073
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5964052287581699,
"acc_stderr": 0.019848280168401154,
"acc_norm": 0.5964052287581699,
"acc_norm_stderr": 0.019848280168401154
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.030769444967296018,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.030769444967296018
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3880048959608323,
"mc1_stderr": 0.01705876150134797,
"mc2": 0.5289794791431763,
"mc2_stderr": 0.015597946760218941
},
"harness|winogrande|5": {
"acc": 0.7198105761641673,
"acc_stderr": 0.012621707979798499
},
"harness|gsm8k|5": {
"acc": 0.40636846095526913,
"acc_stderr": 0.013528846685413248
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_fhai50032__SamChat | [
"region:us"
] | 2024-02-11T14:45:31+00:00 | {"pretty_name": "Evaluation run of fhai50032/SamChat", "dataset_summary": "Dataset automatically created during the evaluation run of model [fhai50032/SamChat](https://huggingface.co/fhai50032/SamChat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fhai50032__SamChat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T14:43:14.190988](https://huggingface.co/datasets/open-llm-leaderboard/details_fhai50032__SamChat/blob/main/results_2024-02-11T14-43-14.190988.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5967272209929594,\n \"acc_stderr\": 0.03347226424165155,\n \"acc_norm\": 0.6019495737279535,\n \"acc_norm_stderr\": 0.034142369952769536,\n \"mc1\": 0.3880048959608323,\n \"mc1_stderr\": 0.01705876150134797,\n \"mc2\": 0.5289794791431763,\n \"mc2_stderr\": 0.015597946760218941\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.575938566552901,\n \"acc_stderr\": 0.014441889627464398,\n \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.014182119866974872\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6229834694284008,\n \"acc_stderr\": 0.004836486437527259,\n \"acc_norm\": 0.8194582752439753,\n \"acc_norm_stderr\": 0.003838519335886879\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.049020713000019756,\n \"acc_norm\": 0.39,\n 
\"acc_norm_stderr\": 0.049020713000019756\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.04966570903978529,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.04966570903978529\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.032685726586674915,\n \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.032685726586674915\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.024594975128920945,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.024594975128920945\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n \"acc_stderr\": 0.025189006660212385,\n \"acc_norm\": 0.7322580645161291,\n \"acc_norm_stderr\": 0.025189006660212385\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217483,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217483\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6230769230769231,\n 
\"acc_stderr\": 0.024570975364225995,\n \"acc_norm\": 0.6230769230769231,\n \"acc_norm_stderr\": 0.024570975364225995\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.031204691225150016,\n \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.031204691225150016\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7834862385321101,\n \"acc_stderr\": 0.017658710594443135,\n \"acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.017658710594443135\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653063,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653063\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598025,\n \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598025\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724145,\n \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724145\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n \"acc_stderr\": 0.025598193686652247,\n \"acc_norm\": 0.811965811965812,\n \"acc_norm_stderr\": 0.025598193686652247\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7650063856960408,\n \"acc_stderr\": 0.015162024152278434,\n \"acc_norm\": 0.7650063856960408,\n \"acc_norm_stderr\": 
0.015162024152278434\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.0261521986197268,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.0261521986197268\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3340782122905028,\n \"acc_stderr\": 0.01577491142238163,\n \"acc_norm\": 0.3340782122905028,\n \"acc_norm_stderr\": 0.01577491142238163\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281406,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281406\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n \"acc_stderr\": 0.02698147804364805,\n \"acc_norm\": 0.6559485530546624,\n \"acc_norm_stderr\": 0.02698147804364805\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824089,\n \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824089\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.41843971631205673,\n \"acc_stderr\": 0.029427994039419994,\n \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.029427994039419994\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39048239895697523,\n \"acc_stderr\": 0.012460135913945073,\n \"acc_norm\": 0.39048239895697523,\n \"acc_norm_stderr\": 0.012460135913945073\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5964052287581699,\n \"acc_stderr\": 0.019848280168401154,\n \"acc_norm\": 0.5964052287581699,\n \"acc_norm_stderr\": 0.019848280168401154\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.746268656716418,\n \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3880048959608323,\n \"mc1_stderr\": 0.01705876150134797,\n \"mc2\": 0.5289794791431763,\n \"mc2_stderr\": 0.015597946760218941\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7198105761641673,\n \"acc_stderr\": 0.012621707979798499\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40636846095526913,\n \"acc_stderr\": 0.013528846685413248\n }\n}\n```", "repo_url": "https://huggingface.co/fhai50032/SamChat", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|arc:challenge|25_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|gsm8k|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hellaswag|10_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-43-14.190988.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-43-14.190988.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-43-14.190988.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T14-43-14.190988.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-43-14.190988.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-43-14.190988.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["**/details_harness|winogrande|5_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T14-43-14.190988.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_11T14_43_14.190988", "path": ["results_2024-02-11T14-43-14.190988.parquet"]}, {"split": "latest", "path": 
["results_2024-02-11T14-43-14.190988.parquet"]}]}]} | 2024-02-11T14:45:55+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of fhai50032/SamChat
Dataset automatically created during the evaluation run of model fhai50032/SamChat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
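For example, to pull the WinoGrande 5-shot details for this model (any of the configuration names listed in this card's file layout can be substituted):

```python
from datasets import load_dataset

# Details for one task of the fhai50032/SamChat evaluation run
data = load_dataset("open-llm-leaderboard/details_fhai50032__SamChat",
	"harness_winogrande_5",
	split="train")
```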
## Latest results
These are the latest results from run 2024-02-11T14:43:14.190988 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
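The headline aggregates for this run are shown below; the full per-task breakdown is stored in the task-specific configurations of this dataset.

```python
{
    "all": {
        "acc": 0.5967272209929594,
        "acc_stderr": 0.03347226424165155,
        "acc_norm": 0.6019495737279535,
        "acc_norm_stderr": 0.034142369952769536,
        "mc1": 0.3880048959608323,
        "mc1_stderr": 0.01705876150134797,
        "mc2": 0.5289794791431763,
        "mc2_stderr": 0.015597946760218941
    },
    "harness|winogrande|5": {
        "acc": 0.7198105761641673,
        "acc_stderr": 0.012621707979798499
    },
    "harness|gsm8k|5": {
        "acc": 0.40636846095526913,
        "acc_stderr": 0.013528846685413248
    }
}
```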
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of fhai50032/SamChat\n\n\n\nDataset automatically created during the evaluation run of model fhai50032/SamChat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T14:43:14.190988(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of fhai50032/SamChat\n\n\n\nDataset automatically created during the evaluation run of model fhai50032/SamChat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T14:43:14.190988(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
6ef59d0866564bfdc0f081310e28228e165c29ce | # Dataset Card for "NumericBench-Eval-small-gpt3.5-zeroshot-result"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | zhangshuoming/NumericBench-Eval-small-gpt3.5-zeroshot-result | [
"region:us"
] | 2024-02-11T14:54:57+00:00 | {"dataset_info": {"features": [{"name": "c", "dtype": "string"}, {"name": "asm", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 390885, "num_examples": 400}], "download_size": 123437, "dataset_size": 390885}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-11T14:55:02+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "NumericBench-Eval-small-gpt3.5-zeroshot-result"
More Information needed | [
"# Dataset Card for \"NumericBench-Eval-small-gpt3.5-zeroshot-result\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"NumericBench-Eval-small-gpt3.5-zeroshot-result\"\n\nMore Information needed"
] |
d277b0cf61428f2adf0840eac382bf529bb317f8 |
# Dataset Card for Evaluation run of Radu1999/MisterUkrainian
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Radu1999/MisterUkrainian](https://huggingface.co/Radu1999/MisterUkrainian) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Radu1999__MisterUkrainian",
"harness_winogrande_5",
split="train")
```
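To enumerate all 63 configuration names programmatically, a minimal sketch using the standard `datasets` helper (it assumes Hub access from your environment):

```python
from datasets import get_dataset_config_names

# Lists every task configuration (plus "results") available for this run
configs = get_dataset_config_names("open-llm-leaderboard/details_Radu1999__MisterUkrainian")
print(len(configs), configs[:5])
```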
## Latest results
These are the [latest results from run 2024-02-11T14:53:33.344977](https://huggingface.co/datasets/open-llm-leaderboard/details_Radu1999__MisterUkrainian/blob/main/results_2024-02-11T14-53-33.344977.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6281465419161211,
"acc_stderr": 0.03272632448467549,
"acc_norm": 0.6302208861306352,
"acc_norm_stderr": 0.033386791320237544,
"mc1": 0.5104039167686658,
"mc1_stderr": 0.017499711430249268,
"mc2": 0.6726428351055485,
"mc2_stderr": 0.014873571856190641
},
"harness|arc:challenge|25": {
"acc": 0.6313993174061433,
"acc_stderr": 0.014097810678042194,
"acc_norm": 0.6783276450511946,
"acc_norm_stderr": 0.013650488084494162
},
"harness|hellaswag|10": {
"acc": 0.6695877315275841,
"acc_stderr": 0.0046940027819395635,
"acc_norm": 0.8631746664011153,
"acc_norm_stderr": 0.0034296051062163665
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952928,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.026795560848122794,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.026795560848122794
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.02453759157283051,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.02453759157283051
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266857,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266857
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.01392775137200151,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.01392775137200151
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47150837988826816,
"acc_stderr": 0.0166953297460158,
"acc_norm": 0.47150837988826816,
"acc_norm_stderr": 0.0166953297460158
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263294,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263294
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829734,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829734
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045708,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045708
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.019162418588623557,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.019162418588623557
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6218905472636815,
"acc_stderr": 0.034288678487786564,
"acc_norm": 0.6218905472636815,
"acc_norm_stderr": 0.034288678487786564
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5104039167686658,
"mc1_stderr": 0.017499711430249268,
"mc2": 0.6726428351055485,
"mc2_stderr": 0.014873571856190641
},
"harness|winogrande|5": {
"acc": 0.8050513022888713,
"acc_stderr": 0.011134099415938285
},
"harness|gsm8k|5": {
"acc": 0.5693707354056103,
"acc_stderr": 0.013639285985979927
}
}
```
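The aggregated metrics shown in the JSON above can also be inspected programmatically. The snippet below is a minimal sketch, assuming the `results` configuration and its `latest` split listed in this card's configuration metadata; the exact column layout of the loaded rows may differ.

```python
from datasets import load_dataset

# Sketch: load the aggregated metrics of the most recent run for this model.
# The "results" configuration name and the "latest" split are assumptions
# taken from the configuration list of this dataset card.
results = load_dataset(
    "open-llm-leaderboard/details_Radu1999__MisterUkrainian",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics of the latest run
```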
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Radu1999__MisterUkrainian | [
"region:us"
] | 2024-02-11T14:55:52+00:00 | {"pretty_name": "Evaluation run of Radu1999/MisterUkrainian", "dataset_summary": "Dataset automatically created during the evaluation run of model [Radu1999/MisterUkrainian](https://huggingface.co/Radu1999/MisterUkrainian) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Radu1999__MisterUkrainian\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T14:53:33.344977](https://huggingface.co/datasets/open-llm-leaderboard/details_Radu1999__MisterUkrainian/blob/main/results_2024-02-11T14-53-33.344977.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6281465419161211,\n \"acc_stderr\": 0.03272632448467549,\n \"acc_norm\": 0.6302208861306352,\n \"acc_norm_stderr\": 0.033386791320237544,\n \"mc1\": 0.5104039167686658,\n \"mc1_stderr\": 0.017499711430249268,\n \"mc2\": 0.6726428351055485,\n \"mc2_stderr\": 0.014873571856190641\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.014097810678042194,\n \"acc_norm\": 0.6783276450511946,\n \"acc_norm_stderr\": 0.013650488084494162\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6695877315275841,\n \"acc_stderr\": 0.0046940027819395635,\n \"acc_norm\": 0.8631746664011153,\n \"acc_norm_stderr\": 0.0034296051062163665\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 
0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.667741935483871,\n \"acc_stderr\": 0.026795560848122794,\n \"acc_norm\": 0.667741935483871,\n \"acc_norm_stderr\": 0.026795560848122794\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6256410256410256,\n \"acc_stderr\": 
0.02453759157283051,\n \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.02453759157283051\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.818348623853211,\n \"acc_stderr\": 0.016530617409266857,\n \"acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266857\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n \"acc_stderr\": 0.01392775137200151,\n \"acc_norm\": 0.8135376756066411,\n \"acc_norm_stderr\": 0.01392775137200151\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47150837988826816,\n \"acc_stderr\": 0.0166953297460158,\n \"acc_norm\": 0.47150837988826816,\n \"acc_norm_stderr\": 0.0166953297460158\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.02575586592263294,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.02575586592263294\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829734,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829734\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.012739711554045708,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.012739711554045708\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.019162418588623557,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.019162418588623557\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n \"acc_stderr\": 0.034288678487786564,\n \"acc_norm\": 0.6218905472636815,\n \"acc_norm_stderr\": 0.034288678487786564\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5104039167686658,\n \"mc1_stderr\": 0.017499711430249268,\n \"mc2\": 0.6726428351055485,\n \"mc2_stderr\": 0.014873571856190641\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938285\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5693707354056103,\n \"acc_stderr\": 0.013639285985979927\n }\n}\n```", "repo_url": "https://huggingface.co/Radu1999/MisterUkrainian", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|arc:challenge|25_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|gsm8k|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hellaswag|10_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-53-33.344977.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-53-33.344977.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-53-33.344977.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T14-53-33.344977.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-53-33.344977.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-53-33.344977.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["**/details_harness|winogrande|5_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T14-53-33.344977.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_11T14_53_33.344977", "path": ["results_2024-02-11T14-53-33.344977.parquet"]}, {"split": "latest", "path": 
["results_2024-02-11T14-53-33.344977.parquet"]}]}]} | 2024-02-11T14:56:13+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Radu1999/MisterUkrainian
Dataset automatically created during the evaluation run of model Radu1999/MisterUkrainian on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-11T14:53:33.344977 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one under the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Radu1999/MisterUkrainian\n\n\n\nDataset automatically created during the evaluation run of model Radu1999/MisterUkrainian on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T14:53:33.344977(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Radu1999/MisterUkrainian\n\n\n\nDataset automatically created during the evaluation run of model Radu1999/MisterUkrainian on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T14:53:33.344977(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7176c351e92692ca65452cb5041a5011cadef43c |
# Dataset Card for Evaluation run of JaeyeonKang/CCK_Asura_v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [JaeyeonKang/CCK_Asura_v2](https://huggingface.co/JaeyeonKang/CCK_Asura_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2",
"harness_winogrande_5",
split="train")
```
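Each per-task configuration also exposes a run-specific split named after the evaluation timestamp. As a minimal sketch (the configuration and split names below are taken from this repository's metadata), the GSM8K details for this exact run could be loaded like this:
```python
from datasets import load_dataset

# Load the GSM8K details for the 2024-02-11 run of this model; the "latest"
# split would point to the same data until a newer run is uploaded.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2",
    "harness_gsm8k_5",
    split="2024_02_11T16_06_06.601479",
)
```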
## Latest results
These are the [latest results from run 2024-02-11T16:06:06.601479](https://huggingface.co/datasets/open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2/blob/main/results_2024-02-11T16-06-06.601479.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7449349138432199,
"acc_stderr": 0.028736114047503484,
"acc_norm": 0.748777442238698,
"acc_norm_stderr": 0.029285406139459322,
"mc1": 0.41370869033047736,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.5697262468044242,
"mc2_stderr": 0.01485199166324778
},
"harness|arc:challenge|25": {
"acc": 0.6493174061433447,
"acc_stderr": 0.013944635930726099,
"acc_norm": 0.7081911262798635,
"acc_norm_stderr": 0.013284525292403503
},
"harness|hellaswag|10": {
"acc": 0.6916948814977096,
"acc_stderr": 0.004608495469860377,
"acc_norm": 0.8809002190798646,
"acc_norm_stderr": 0.0032324391398815544
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.03999262876617721,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.03999262876617721
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8289473684210527,
"acc_stderr": 0.03064360707167709,
"acc_norm": 0.8289473684210527,
"acc_norm_stderr": 0.03064360707167709
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7924528301886793,
"acc_stderr": 0.024959918028911267,
"acc_norm": 0.7924528301886793,
"acc_norm_stderr": 0.024959918028911267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7276595744680852,
"acc_stderr": 0.029101290698386715,
"acc_norm": 0.7276595744680852,
"acc_norm_stderr": 0.029101290698386715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7448275862068966,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.7448275862068966,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5529100529100529,
"acc_stderr": 0.025606723995777025,
"acc_norm": 0.5529100529100529,
"acc_norm_stderr": 0.025606723995777025
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8774193548387097,
"acc_stderr": 0.0186567209917894,
"acc_norm": 0.8774193548387097,
"acc_norm_stderr": 0.0186567209917894
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6305418719211823,
"acc_stderr": 0.03395970381998575,
"acc_norm": 0.6305418719211823,
"acc_norm_stderr": 0.03395970381998575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.030874145136562073,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.030874145136562073
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9191919191919192,
"acc_stderr": 0.019417681889724536,
"acc_norm": 0.9191919191919192,
"acc_norm_stderr": 0.019417681889724536
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9533678756476683,
"acc_stderr": 0.01521676181926259,
"acc_norm": 0.9533678756476683,
"acc_norm_stderr": 0.01521676181926259
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7923076923076923,
"acc_stderr": 0.02056753956724681,
"acc_norm": 0.7923076923076923,
"acc_norm_stderr": 0.02056753956724681
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4185185185185185,
"acc_stderr": 0.03007801307502206,
"acc_norm": 0.4185185185185185,
"acc_norm_stderr": 0.03007801307502206
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398904,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4966887417218543,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.4966887417218543,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9137614678899083,
"acc_stderr": 0.012035597300116243,
"acc_norm": 0.9137614678899083,
"acc_norm_stderr": 0.012035597300116243
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.031415546294025445,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.031415546294025445
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640276,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640276
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8161434977578476,
"acc_stderr": 0.02599837909235651,
"acc_norm": 0.8161434977578476,
"acc_norm_stderr": 0.02599837909235651
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.03154521672005472,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.03154521672005472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9256198347107438,
"acc_stderr": 0.02395268883667674,
"acc_norm": 0.9256198347107438,
"acc_norm_stderr": 0.02395268883667674
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.03247224389917948,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.03247224389917948
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.883495145631068,
"acc_stderr": 0.03176683948640406,
"acc_norm": 0.883495145631068,
"acc_norm_stderr": 0.03176683948640406
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9230769230769231,
"acc_stderr": 0.017456987872436186,
"acc_norm": 0.9230769230769231,
"acc_norm_stderr": 0.017456987872436186
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8850574712643678,
"acc_stderr": 0.011405720724593964,
"acc_norm": 0.8850574712643678,
"acc_norm_stderr": 0.011405720724593964
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8208092485549133,
"acc_stderr": 0.020647590029679332,
"acc_norm": 0.8208092485549133,
"acc_norm_stderr": 0.020647590029679332
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6189944134078212,
"acc_stderr": 0.016242028834053603,
"acc_norm": 0.6189944134078212,
"acc_norm_stderr": 0.016242028834053603
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.021986032182064148,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.021986032182064148
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8263665594855305,
"acc_stderr": 0.02151405158597041,
"acc_norm": 0.8263665594855305,
"acc_norm_stderr": 0.02151405158597041
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.845679012345679,
"acc_stderr": 0.020100830999850994,
"acc_norm": 0.845679012345679,
"acc_norm_stderr": 0.020100830999850994
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.02949482760014436,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.02949482760014436
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5710560625814863,
"acc_stderr": 0.012640625443067365,
"acc_norm": 0.5710560625814863,
"acc_norm_stderr": 0.012640625443067365
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8125,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.015750526284363353,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.015750526284363353
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.0250002560395462,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.0250002560395462
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9154228855721394,
"acc_stderr": 0.019675343217199177,
"acc_norm": 0.9154228855721394,
"acc_norm_stderr": 0.019675343217199177
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.96,
"acc_stderr": 0.0196946385566932,
"acc_norm": 0.96,
"acc_norm_stderr": 0.0196946385566932
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.02464806896136616,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.02464806896136616
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41370869033047736,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.5697262468044242,
"mc2_stderr": 0.01485199166324778
},
"harness|winogrande|5": {
"acc": 0.8524072612470402,
"acc_stderr": 0.009968715765479664
},
"harness|gsm8k|5": {
"acc": 0.6588324488248674,
"acc_stderr": 0.013059111935831494
}
}
```
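To read these aggregated numbers programmatically instead of copying them from the JSON above, a minimal sketch (assuming the `results` configuration listed in this repository's metadata) is:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2",
    "results",
    split="latest",
)

# Inspect the first row, which holds the per-task metrics for this run.
print(results[0])
```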
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2 | [
"region:us"
] | 2024-02-11T16:08:28+00:00 | {"pretty_name": "Evaluation run of JaeyeonKang/CCK_Asura_v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [JaeyeonKang/CCK_Asura_v2](https://huggingface.co/JaeyeonKang/CCK_Asura_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T16:06:06.601479](https://huggingface.co/datasets/open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2/blob/main/results_2024-02-11T16-06-06.601479.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7449349138432199,\n \"acc_stderr\": 0.028736114047503484,\n \"acc_norm\": 0.748777442238698,\n \"acc_norm_stderr\": 0.029285406139459322,\n \"mc1\": 0.41370869033047736,\n \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.5697262468044242,\n \"mc2_stderr\": 0.01485199166324778\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6493174061433447,\n \"acc_stderr\": 0.013944635930726099,\n \"acc_norm\": 0.7081911262798635,\n \"acc_norm_stderr\": 0.013284525292403503\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6916948814977096,\n \"acc_stderr\": 0.004608495469860377,\n \"acc_norm\": 0.8809002190798646,\n \"acc_norm_stderr\": 0.0032324391398815544\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n \"acc_stderr\": 0.03999262876617721,\n \"acc_norm\": 0.6888888888888889,\n \"acc_norm_stderr\": 0.03999262876617721\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.03064360707167709,\n \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.03064360707167709\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7924528301886793,\n \"acc_stderr\": 0.024959918028911267,\n \"acc_norm\": 0.7924528301886793,\n \"acc_norm_stderr\": 0.024959918028911267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 
0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.04974229460422817,\n \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.04974229460422817\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7276595744680852,\n \"acc_stderr\": 0.029101290698386715,\n \"acc_norm\": 0.7276595744680852,\n \"acc_norm_stderr\": 0.029101290698386715\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7448275862068966,\n \"acc_stderr\": 0.03632984052707842,\n \"acc_norm\": 0.7448275862068966,\n \"acc_norm_stderr\": 0.03632984052707842\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5529100529100529,\n \"acc_stderr\": 0.025606723995777025,\n \"acc_norm\": 0.5529100529100529,\n \"acc_norm_stderr\": 0.025606723995777025\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8774193548387097,\n \"acc_stderr\": 0.0186567209917894,\n \"acc_norm\": 0.8774193548387097,\n \"acc_norm_stderr\": 0.0186567209917894\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6305418719211823,\n \"acc_stderr\": 0.03395970381998575,\n \"acc_norm\": 0.6305418719211823,\n \"acc_norm_stderr\": 0.03395970381998575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.030874145136562073,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.030874145136562073\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9191919191919192,\n \"acc_stderr\": 0.019417681889724536,\n \"acc_norm\": 0.9191919191919192,\n \"acc_norm_stderr\": 0.019417681889724536\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9533678756476683,\n \"acc_stderr\": 0.01521676181926259,\n \"acc_norm\": 0.9533678756476683,\n \"acc_norm_stderr\": 0.01521676181926259\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.7923076923076923,\n \"acc_stderr\": 0.02056753956724681,\n \"acc_norm\": 0.7923076923076923,\n \"acc_norm_stderr\": 0.02056753956724681\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4185185185185185,\n \"acc_stderr\": 0.03007801307502206,\n \"acc_norm\": 0.4185185185185185,\n \"acc_norm_stderr\": 0.03007801307502206\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4966887417218543,\n \"acc_stderr\": 0.04082393379449654,\n \"acc_norm\": 0.4966887417218543,\n \"acc_norm_stderr\": 0.04082393379449654\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9137614678899083,\n \"acc_stderr\": 0.012035597300116243,\n \"acc_norm\": 0.9137614678899083,\n \"acc_norm_stderr\": 0.012035597300116243\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.031415546294025445,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.031415546294025445\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640276,\n \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640276\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8161434977578476,\n \"acc_stderr\": 0.02599837909235651,\n \"acc_norm\": 0.8161434977578476,\n \"acc_norm_stderr\": 0.02599837909235651\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.03154521672005472,\n \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.03154521672005472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9256198347107438,\n \"acc_stderr\": 0.02395268883667674,\n \"acc_norm\": 0.9256198347107438,\n \"acc_norm_stderr\": 0.02395268883667674\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.03247224389917948,\n \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.03247224389917948\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.03176683948640406,\n \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.03176683948640406\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9230769230769231,\n \"acc_stderr\": 0.017456987872436186,\n \"acc_norm\": 0.9230769230769231,\n \"acc_norm_stderr\": 0.017456987872436186\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8850574712643678,\n \"acc_stderr\": 0.011405720724593964,\n 
\"acc_norm\": 0.8850574712643678,\n \"acc_norm_stderr\": 0.011405720724593964\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8208092485549133,\n \"acc_stderr\": 0.020647590029679332,\n \"acc_norm\": 0.8208092485549133,\n \"acc_norm_stderr\": 0.020647590029679332\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6189944134078212,\n \"acc_stderr\": 0.016242028834053603,\n \"acc_norm\": 0.6189944134078212,\n \"acc_norm_stderr\": 0.016242028834053603\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8202614379084967,\n \"acc_stderr\": 0.021986032182064148,\n \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.021986032182064148\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8263665594855305,\n \"acc_stderr\": 0.02151405158597041,\n \"acc_norm\": 0.8263665594855305,\n \"acc_norm_stderr\": 0.02151405158597041\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.845679012345679,\n \"acc_stderr\": 0.020100830999850994,\n \"acc_norm\": 0.845679012345679,\n \"acc_norm_stderr\": 0.020100830999850994\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.02949482760014436,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.02949482760014436\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5710560625814863,\n \"acc_stderr\": 0.012640625443067365,\n \"acc_norm\": 0.5710560625814863,\n \"acc_norm_stderr\": 0.012640625443067365\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.015750526284363353,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.015750526284363353\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.0250002560395462,\n \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.0250002560395462\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9154228855721394,\n \"acc_stderr\": 0.019675343217199177,\n \"acc_norm\": 0.9154228855721394,\n \"acc_norm_stderr\": 0.019675343217199177\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.96,\n \"acc_stderr\": 0.0196946385566932,\n \"acc_norm\": 0.96,\n \"acc_norm_stderr\": 0.0196946385566932\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.02464806896136616,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.02464806896136616\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41370869033047736,\n \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.5697262468044242,\n \"mc2_stderr\": 0.01485199166324778\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8524072612470402,\n \"acc_stderr\": 0.009968715765479664\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6588324488248674,\n \"acc_stderr\": 0.013059111935831494\n }\n}\n```", "repo_url": "https://huggingface.co/JaeyeonKang/CCK_Asura_v2", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|arc:challenge|25_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|gsm8k|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hellaswag|10_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-06-06.601479.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-06-06.601479.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-06-06.601479.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T16-06-06.601479.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-06-06.601479.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-06-06.601479.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["**/details_harness|winogrande|5_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T16-06-06.601479.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_11T16_06_06.601479", "path": ["results_2024-02-11T16-06-06.601479.parquet"]}, {"split": "latest", "path": 
["results_2024-02-11T16-06-06.601479.parquet"]}]}]} | 2024-02-11T16:08:50+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of JaeyeonKang/CCK_Asura_v2
Dataset automatically created during the evaluation run of model JaeyeonKang/CCK_Asura_v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
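A minimal sketch of that call, assuming the details repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming convention (the repo id below is inferred from that convention, not quoted from this card):

```python
from datasets import load_dataset

# Repo id inferred from the Open LLM Leaderboard naming convention for this model.
data = load_dataset("open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2",
	"harness_winogrande_5",
	split="train")
```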
## Latest results
These are the latest results from run 2024-02-11T16:06:06.601479 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of JaeyeonKang/CCK_Asura_v2\n\n\n\nDataset automatically created during the evaluation run of model JaeyeonKang/CCK_Asura_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T16:06:06.601479(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of JaeyeonKang/CCK_Asura_v2\n\n\n\nDataset automatically created during the evaluation run of model JaeyeonKang/CCK_Asura_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T16:06:06.601479(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
3c0f02eefe546217f987cc38e88c76231ce220da |
# Dataset Card for "CIDAR-MCQ-100"
# CIDAR-MCQ-100
CIDAR-MCQ-100 contains **100** multiple-choice questions and answers about Arabic culture.
## 📚 Datasets Summary
<table>
<tr>
<th>Name</th>
<th>Explanation</th>
</tr>
<tr>
    <td><a href=https://huggingface.co/datasets/arbml/cidar>CIDAR</a></td>
<td>10,000 instructions and responses in Arabic</td>
</tr>
<tr>
    <td><a href=https://huggingface.co/datasets/arbml/cidar-eval-100>CIDAR-EVAL-100</a></td>
<td>100 instructions to evaluate LLMs on cultural relevance</td>
</tr>
<tr>
    <td><a href=https://huggingface.co/datasets/arbml/cidar-mcq-100><b>CIDAR-MCQ-100</b></a></td>
    <td>100 multiple-choice questions and answers to evaluate LLMs on cultural relevance</td>
</tr>
</table>
<div width="30px" align="center">
| Category | CIDAR-EVAL-100 | <a href=https://huggingface.co/datasets/arbml/cidar-mcq-100><b>CIDAR-MCQ-100</b></a>|
|----------|:-------------:|:------:|
|Food&Drinks | 14 | 8 |
|Names | 14 | 8 |
|Animals | 2 | 4 |
|Language | 10 | 20 |
|Jokes&Puzzles | 3 | 7 |
|Religion | 5 | 10 |
|Business | 6 | 7 |
|Cloths | 4 | 5 |
|Science | 3 | 4 |
|Sports&Games | 4 | 2 |
|Tradition | 4 | 10 |
|Weather | 4 | 2 |
|Geography | 7 | 8 |
|General | 4 | 3 |
|Fonts | 5 | 2 |
|Literature | 10 | 2 |
|Plants | 3 | 0 |
|<i>Total</i> | 100 | 100 |
</div>
## 📋 Dataset Structure
- `Question(str)`: Question about the Arabic culture.
- `A(str)`: First choice.
- `B(str)`: Second choice.
- `C(str)`: Third choice.
- `D(str)`: Fourth choice.
- `answer(str)`: The correct choice from A, B, C, and D.
## 📁 Loading The Dataset
You can download the dataset directly from HuggingFace or use the following code:
```python
from datasets import load_dataset
cidar = load_dataset('arbml/CIDAR-MCQ-100')
```
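As a quick check after loading, you can print the first example using the fields documented above (a minimal sketch; the `train` split name matches this card's metadata):

```python
# Show the question, the four choices, and the gold answer of the first example.
example = cidar["train"][0]
print(example["Question"])
for choice in ["A", "B", "C", "D"]:
    print(f"{choice}. {example[choice]}")
print("Answer:", example["answer"])
```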
## 📄 Sample From The Dataset:
**Question**: حدد حيوان مشهور في المنطقة (Identify a well-known animal in the region)
**A**: الجمل (the camel)
**B**: اللاما (the llama)
**C**: الكانغرو (the kangaroo)
**D**: الدب القطبي (the polar bear)
**answer**: A
## 🔑 License
The dataset is licensed under [**Apache-2.0**](https://www.apache.org/licenses/LICENSE-2.0).
## Citation
```
@misc{alyafeai2024cidar,
title={{CIDAR: Culturally Relevant Instruction Dataset For Arabic}},
author={Zaid Alyafeai and Khalid Almubarak and Ahmed Ashraf and Deema Alnuhait and Saied Alshahrani and Gubran A. Q. Abdulrahman and Gamil Ahmed and Qais Gawah and Zead Saleh and Mustafa Ghaleb and Yousef Ali and Maged S. Al-Shaibani},
year={2024},
eprint={2402.03177},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | arbml/CIDAR-MCQ-100 | [
"task_categories:multiple-choice",
"size_categories:n<1K",
"language:ar",
"license:apache-2.0",
"arxiv:2402.03177",
"region:us"
] | 2024-02-11T16:10:49+00:00 | {"language": ["ar"], "license": "apache-2.0", "size_categories": ["n<1K"], "task_categories": ["multiple-choice"], "pretty_name": "CIDAR-MCQ-100 ", "dataset_info": {"features": [{"name": "Question", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 18899, "num_examples": 100}], "download_size": 13287, "dataset_size": 18899}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-14T15:45:09+00:00 | [
"2402.03177"
] | [
"ar"
] | TAGS
#task_categories-multiple-choice #size_categories-n<1K #language-Arabic #license-apache-2.0 #arxiv-2402.03177 #region-us
| Dataset Card for "CIDAR-MCQ-100"
================================
CIDAR-MCQ-100
=============
CIDAR-MCQ-100 contains 100 multiple-choice questions and answers about Arabic culture.
Datasets Summary
----------------
Dataset Structure
-----------------
* 'Question(str)': Question about the Arabic culture.
* 'A(str)': First choice.
* 'B(str)': Second choice.
* 'C(str)': Third choice.
* 'D(str)': Fourth choice.
* 'answer(str)': The correct choice from A,B,C, and D.
Loading The Dataset
-------------------
You can download the dataset directly from HuggingFace or use the following code:
Sample From The Dataset:
------------------------
Question: حدد حيوان مشهور في المنطقة
A: الجمل
B: اللاما
C: الكانغرو
D: الدب القطبي
answer: A
License
-------
The dataset is licensed under Apache-2.0.
| [] | [
"TAGS\n#task_categories-multiple-choice #size_categories-n<1K #language-Arabic #license-apache-2.0 #arxiv-2402.03177 #region-us \n"
] |
e7c091a9a73efe93885b840f3129a1353e2bad99 |
# Dataset Card for Evaluation run of Sao10K/Solstice-11B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sao10K/Solstice-11B-v1](https://huggingface.co/Sao10K/Solstice-11B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Solstice-11B-v1",
"harness_winogrande_5",
split="train")
```
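If you only need the aggregated metrics reported below, a minimal sketch is to load the `results` configuration with its `latest` split instead (both names follow the configs listed in this card's metadata):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model.
results = load_dataset("open-llm-leaderboard/details_Sao10K__Solstice-11B-v1",
	"results",
	split="latest")
print(results[0])
```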
## Latest results
These are the [latest results from run 2024-02-11T16:13:21.638790](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Solstice-11B-v1/blob/main/results_2024-02-11T16-13-21.638790.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6608619004000204,
"acc_stderr": 0.03153065447134328,
"acc_norm": 0.6642049170224315,
"acc_norm_stderr": 0.03215888521103266,
"mc1": 0.4724602203182375,
"mc1_stderr": 0.017476930190712187,
"mc2": 0.6198092934370032,
"mc2_stderr": 0.015817855595420743
},
"harness|arc:challenge|25": {
"acc": 0.6715017064846417,
"acc_stderr": 0.013724978465537302,
"acc_norm": 0.7056313993174061,
"acc_norm_stderr": 0.013318528460539419
},
"harness|hellaswag|10": {
"acc": 0.693487353116909,
"acc_stderr": 0.004601029188459101,
"acc_norm": 0.8739294961163115,
"acc_norm_stderr": 0.0033125043674180218
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933713,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7986111111111112,
"acc_stderr": 0.033536474697138406,
"acc_norm": 0.7986111111111112,
"acc_norm_stderr": 0.033536474697138406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.025733641991838987,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.025733641991838987
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.021732540689329262,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.021732540689329262
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656208,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656208
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.02366435940288023,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.02366435940288023
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223154,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815632,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815632
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7310924369747899,
"acc_stderr": 0.028801392193631276,
"acc_norm": 0.7310924369747899,
"acc_norm_stderr": 0.028801392193631276
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8676470588235294,
"acc_stderr": 0.023784297520918853,
"acc_norm": 0.8676470588235294,
"acc_norm_stderr": 0.023784297520918853
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8438818565400844,
"acc_stderr": 0.023627159460318677,
"acc_norm": 0.8438818565400844,
"acc_norm_stderr": 0.023627159460318677
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884864,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884864
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7969348659003831,
"acc_stderr": 0.01438552507661157,
"acc_norm": 0.7969348659003831,
"acc_norm_stderr": 0.01438552507661157
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.02386800326250011,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.02386800326250011
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4960893854748603,
"acc_stderr": 0.016721990073156657,
"acc_norm": 0.4960893854748603,
"acc_norm_stderr": 0.016721990073156657
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.0242886194660461,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.0242886194660461
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694905,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694905
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.02301670564026219,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.02301670564026219
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.500651890482399,
"acc_stderr": 0.012770225252255555,
"acc_norm": 0.500651890482399,
"acc_norm_stderr": 0.012770225252255555
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103142,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.02783302387139968,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.02783302387139968
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117825,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117825
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4724602203182375,
"mc1_stderr": 0.017476930190712187,
"mc2": 0.6198092934370032,
"mc2_stderr": 0.015817855595420743
},
"harness|winogrande|5": {
"acc": 0.8310970797158642,
"acc_stderr": 0.010529981411838925
},
"harness|gsm8k|5": {
"acc": 0.5079605761940864,
"acc_stderr": 0.013770739063135374
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Sao10K__Solstice-11B-v1 | [
"region:us"
] | 2024-02-11T16:15:38+00:00 | {"pretty_name": "Evaluation run of Sao10K/Solstice-11B-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [Sao10K/Solstice-11B-v1](https://huggingface.co/Sao10K/Solstice-11B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Solstice-11B-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T16:13:21.638790](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Solstice-11B-v1/blob/main/results_2024-02-11T16-13-21.638790.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6608619004000204,\n \"acc_stderr\": 0.03153065447134328,\n \"acc_norm\": 0.6642049170224315,\n \"acc_norm_stderr\": 0.03215888521103266,\n \"mc1\": 0.4724602203182375,\n \"mc1_stderr\": 0.017476930190712187,\n \"mc2\": 0.6198092934370032,\n \"mc2_stderr\": 0.015817855595420743\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6715017064846417,\n \"acc_stderr\": 0.013724978465537302,\n \"acc_norm\": 0.7056313993174061,\n \"acc_norm_stderr\": 0.013318528460539419\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.693487353116909,\n \"acc_stderr\": 0.004601029188459101,\n \"acc_norm\": 0.8739294961163115,\n \"acc_norm_stderr\": 0.0033125043674180218\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933713,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7986111111111112,\n \"acc_stderr\": 0.033536474697138406,\n \"acc_norm\": 0.7986111111111112,\n \"acc_norm_stderr\": 0.033536474697138406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 
0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5350877192982456,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.5350877192982456,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.025733641991838987,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.025733641991838987\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n \"acc_stderr\": 0.021732540689329262,\n \"acc_norm\": 0.8225806451612904,\n \"acc_norm_stderr\": 0.021732540689329262\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656208,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656208\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8737373737373737,\n \"acc_stderr\": 0.02366435940288023,\n \"acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.02366435940288023\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223154,\n \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223154\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7310924369747899,\n \"acc_stderr\": 0.028801392193631276,\n \"acc_norm\": 0.7310924369747899,\n \"acc_norm_stderr\": 0.028801392193631276\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8676470588235294,\n \"acc_stderr\": 0.023784297520918853,\n \"acc_norm\": 0.8676470588235294,\n \"acc_norm_stderr\": 0.023784297520918853\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8438818565400844,\n \"acc_stderr\": 0.023627159460318677,\n \"acc_norm\": 0.8438818565400844,\n \"acc_norm_stderr\": 0.023627159460318677\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03826076324884864,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03826076324884864\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7969348659003831,\n \"acc_stderr\": 0.01438552507661157,\n \"acc_norm\": 
0.7969348659003831,\n \"acc_norm_stderr\": 0.01438552507661157\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.02386800326250011,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.02386800326250011\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4960893854748603,\n \"acc_stderr\": 0.016721990073156657,\n \"acc_norm\": 0.4960893854748603,\n \"acc_norm_stderr\": 0.016721990073156657\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.0242886194660461,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.0242886194660461\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.025494259350694905,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.025494259350694905\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.02301670564026219,\n \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.02301670564026219\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.500651890482399,\n \"acc_stderr\": 0.012770225252255555,\n \"acc_norm\": 0.500651890482399,\n \"acc_norm_stderr\": 0.012770225252255555\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103142,\n \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139968,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139968\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117825,\n \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117825\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4724602203182375,\n \"mc1_stderr\": 0.017476930190712187,\n \"mc2\": 0.6198092934370032,\n \"mc2_stderr\": 0.015817855595420743\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8310970797158642,\n \"acc_stderr\": 0.010529981411838925\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5079605761940864,\n \"acc_stderr\": 0.013770739063135374\n }\n}\n```", "repo_url": "https://huggingface.co/Sao10K/Solstice-11B-v1", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|arc:challenge|25_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|gsm8k|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hellaswag|10_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-13-21.638790.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-13-21.638790.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-13-21.638790.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T16-13-21.638790.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-13-21.638790.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-13-21.638790.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["**/details_harness|winogrande|5_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T16-13-21.638790.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_11T16_13_21.638790", "path": ["results_2024-02-11T16-13-21.638790.parquet"]}, {"split": "latest", "path": 
["results_2024-02-11T16-13-21.638790.parquet"]}]}]} | 2024-02-11T16:15:59+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Sao10K/Solstice-11B-v1
Dataset automatically created during the evaluation run of model Sao10K/Solstice-11B-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
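A minimal sketch, assuming this card follows the same repository naming convention as the other Open LLM Leaderboard detail repositories (i.e. `open-llm-leaderboard/details_Sao10K__Solstice-11B-v1`; the exact repository id is not stated in this entry):
```python
from datasets import load_dataset

# Load the per-sample details of one task from the latest run
data = load_dataset("open-llm-leaderboard/details_Sao10K__Solstice-11B-v1",
                    "harness_winogrande_5",
                    split="train")
```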
## Latest results
These are the latest results from run 2024-02-11T16:13:21.638790 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
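To read the aggregated numbers programmatically, a hedged sketch is to load the `results` configuration at its `latest` split (both names appear in the file listing for this run; the repository id is the same assumption as in the snippet above):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run of this evaluation
results = load_dataset("open-llm-leaderboard/details_Sao10K__Solstice-11B-v1",
                       "results",
                       split="latest")
print(results[0])
```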
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Sao10K/Solstice-11B-v1\n\n\n\nDataset automatically created during the evaluation run of model Sao10K/Solstice-11B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T16:13:21.638790(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Sao10K/Solstice-11B-v1\n\n\n\nDataset automatically created during the evaluation run of model Sao10K/Solstice-11B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T16:13:21.638790(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
2658b9f191ab477fb17fb25745a8ffce51861849 |
# Dataset Card for Alpaca-Cleaned
- **Repository:** https://github.com/gururise/AlpacaDataCleaned
## Dataset Description
This is a cleaned version of the original Alpaca Dataset released by Stanford. The following issues have been identified in the original release and fixed in this dataset:
1. **Hallucinations:** Many instructions in the original dataset referenced data on the internet, which just caused GPT-3 to hallucinate an answer.
```
"instruction":"Summarize the given article in 200 Words.",
"input": "https://www.bbc.com/news/world-51461830",
"output": "The recent protests in Belarus are part of a growing wave of discontent against the rule of Alexander Lukashenko, who has led the country since 1994. Demonstrations against his government have been ongoing since August, and come in the wake of an election widely seen as rigged. As police and security forces have cracked down on protesters, the movement has only grown, with hundreds of thousands of people taking to the streets to call for new leadership. Human rights activists have accused the government of using extreme violence in order prosecute protesters, leading to an international outcry and sanctions from the European Union. As the situation in Belarus continues to unfold, the international community is monitoring the situation closely - hoping for a peaceful resolution, while also standing firm against the oppressive tactics employed by the Belarusian state."
"instruction": "Classify the following shape according to simple geometry.",
"input": "<Picture Attached>",
"output": "Triangle"
"instruction": "Given a recent news article, write a headline for it.",
"input": "https://www.news.example.com/article-1",
"output": "\"NASA Launches a Mars 2020 Mission to Search for Signs of Ancient Life\""
```
2. **Merged Instructions:** There were many instructions that were merged together in the original dataset for some reason.
```
"instruction": "Should I invest in stocks?",
"input": "",
"output": "It depends on your individual situation. Investing in stocks carries a certain amount of risk but can also offer potential rewards. Make sure to research the different types of stocks, understand the risks and rewards, and develop a plan that works for you.\n##\n18. Instruction: Identify the noun in the following sentence:\n18. Input:\nThe large apple was delicious.\n18. Output:\nApple"
```
3. **Empty outputs:** Some entries in the original dataset had empty outputs.
```
"instruction": "Calculate the area of the following shape in square centimeters.",
"input": "rectangle of size 4 cm x 5 cm",
"output": ""
```
4. **Empty code examples:** Some descriptions in the original dataset were missing code examples, making it difficult to understand the intended behavior of the code.
5. **Instructions to generate images:** Some descriptions in the original dataset included instructions to generate images, something obviously not possible.
```
"instruction": "Create a graphic or logo that visually represents the word \"courage\".",
"input": "",
"output": "<No Output>"
```
6. **N/A outputs:** Some code snippets in the original dataset had N/A outputs.
7. **Inconsistent input field:** The original dataset had inconsistent usage of the input field when it was supposed to be empty.
```
"input":"<no input>"
"input":"No input"
"input":"noinput"
"input":"<noinput>"
```
8. **Wrong answers:** Some instructions/questions in the original dataset had incorrect answers (for example, the median of the data set below is 4.5, not 5, and 25 m is 0.025 km, not 25 km). About 80% of the math problems are estimated to have incorrect answers.
```
"instruction": "Calculate the median of the following data set.",
"input": "1, 2, 4, 5, 8, 9",
"output": "5"
"instruction": "Convert 25m to km.",
"input": "",
"output": "25km"
```
9. **Non-Sensical/Unclear instructions:** Many instructions are unclear; we try to clarify (or re-write) instructions that are non-sensical. Instructions that are slightly unclear, but where one could deduce the meaning, are not altered.
```
"instruction": "Freeze the following sample of yogurt for 10 minutes.",
"input": "Yogurt sample",
"output": "<noinput>"
"instruction": "Increase the font size to 12 points.",
"input": "",
"output": "The font size has been increased to 12 points."
```
10. **Extraneous escape and control characters:** The original dataset had several entries with extraneous escape and control characters.
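As an illustration only (not the actual cleaning pipeline used by the AlpacaDataCleaned project), the sketch below shows how a few of the issues above could be flagged programmatically. It assumes each record is a plain dict with `instruction`, `input`, and `output` keys, and the marker strings are the ones quoted in the examples above:
```python
import re

# "Empty" input placeholders observed in the original data (issue 7)
EMPTY_INPUT_MARKERS = {"<no input>", "no input", "noinput", "<noinput>"}

def has_issue(record: dict) -> bool:
    """Return True if the record exhibits one of the simple problems listed above."""
    input_text = record["input"].strip()
    output = record["output"].strip()

    # Issues 3 and 6: empty or N/A outputs
    if output == "" or output.upper() == "N/A" or output == "<No Output>":
        return True
    # Issue 1: the "input" is just a URL the generating model could never have fetched
    if re.fullmatch(r"https?://\S+", input_text):
        return True
    # Issue 7: inconsistent placeholders for an empty input field
    if input_text.lower() in EMPTY_INPUT_MARKERS:
        return True
    # Issue 2: a second numbered "Instruction:" block embedded in the output
    if re.search(r"\n\d+\.\s*Instruction:", record["output"]):
        return True
    return False

# cleaned = [r for r in records if not has_issue(r)]
```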
### Original Alpaca Dataset Summary
Alpaca is a dataset of 52,000 instructions and demonstrations generated by OpenAI's `text-davinci-003` engine. This instruction data can be used to conduct instruction-tuning for language models and make the language model follow instructions better.
The authors built on the data generation pipeline from [Self-Instruct framework](https://github.com/yizhongw/self-instruct) and made the following modifications:
- The `text-davinci-003` engine to generate the instruction data instead of `davinci`.
- A [new prompt](https://github.com/tatsu-lab/stanford_alpaca/blob/main/prompt.txt) was written that explicitly gave the requirement of instruction generation to `text-davinci-003`.
- Much more aggressive batch decoding was used, i.e., generating 20 instructions at once, which significantly reduced the cost of data generation.
- The data generation pipeline was simplified by discarding the difference between classification and non-classification instructions.
- Only a single instance was generated for each instruction, instead of 2 to 3 instances as in Self-Instruct.
This produced an instruction-following dataset with 52K examples obtained at a much lower cost (less than $500).
In a preliminary study, the authors also found the 52K generated data to be much more diverse than the data released by [Self-Instruct](https://github.com/yizhongw/self-instruct/blob/main/data/seed_tasks.jsonl).
### Supported Tasks and Leaderboards
The Alpaca dataset is designed for instruction training of pretrained language models.
### Languages
The data in Alpaca are in English (BCP-47 en).
## Dataset Structure
### Data Instances
An example of "train" looks as follows:
```json
{
"instruction": "Create a classification task by clustering the given list of items.",
"input": "Apples, oranges, bananas, strawberries, pineapples",
"output": "Class 1: Apples, Oranges\nClass 2: Bananas, Strawberries\nClass 3: Pineapples",
"text": "Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.\n\n### Instruction:\nCreate a classification task by clustering the given list of items.\n\n### Input:\nApples, oranges, bananas, strawberries, pineapples\n\n### Response:\nClass 1: Apples, Oranges\nClass 2: Bananas, Strawberries\nClass 3: Pineapples",
}
```
### Data Fields
The data fields are as follows:
* `instruction`: describes the task the model should perform. Each of the 52K instructions is unique.
* `input`: optional context or input for the task. For example, when the instruction is "Summarize the following article", the input is the article. Around 40% of the examples have an input.
* `output`: the answer to the instruction as generated by `text-davinci-003`.
* `text`: the `instruction`, `input` and `output` formatted with the [prompt template](https://github.com/tatsu-lab/stanford_alpaca#data-release) used by the authors for fine-tuning their models.
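For illustration, a minimal sketch of how the `text` field can be reconstructed from the other three fields, using the with-input template visible in the example above (the prompt template linked above also covers records whose `input` is empty; this sketch omits that case):
```python
PROMPT_WITH_INPUT = (
    "Below is an instruction that describes a task, paired with an input that "
    "provides further context. Write a response that appropriately completes "
    "the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Input:\n{input}\n\n"
    "### Response:\n{output}"
)

def build_text(record: dict) -> str:
    # Simplified: assumes a non-empty `input`; see the note above for the empty-input case.
    return PROMPT_WITH_INPUT.format(**record)
```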
### Data Splits
| | train |
|---------------|------:|
| alpaca | 52002 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
Excerpt from the [blog post](https://crfm.stanford.edu/2023/03/13/alpaca.html) accompanying the release of this dataset:
> We believe that releasing the above assets will enable the academic community to perform controlled scientific studies on instruction-following language models, resulting in better science and ultimately new techniques to address the existing deficiencies with these models. At the same time, any release carries some risk. First, we recognize that releasing our training recipe reveals the feasibility of certain capabilities. On one hand, this enables more people (including bad actors) to create models that could cause harm (either intentionally or not). On the other hand, this awareness might incentivize swift defensive action, especially from the academic community, now empowered by the means to perform deeper safety research on such models. Overall, we believe that the benefits for the research community outweigh the risks of this particular release. Given that we are releasing the training recipe, we believe that releasing the data, model weights, and training code incur minimal further risk, given the simplicity of the recipe. At the same time, releasing these assets has enormous benefits for reproducible science, so that the academic community can use standard datasets, models, and code to perform controlled comparisons and to explore extensions. Deploying an interactive demo for Alpaca also poses potential risks, such as more widely disseminating harmful content and lowering the barrier for spam, fraud, or disinformation. We have put into place two risk mitigation strategies. First, we have implemented a content filter using OpenAI’s content moderation API, which filters out harmful content as defined by OpenAI’s usage policies. Second, we watermark all the model outputs using the method described in Kirchenbauer et al. 2023, so that others can detect (with some probability) whether an output comes from Alpaca 7B. Finally, we have strict terms and conditions for using the demo; it is restricted to non-commercial uses and to uses that follow LLaMA’s license agreement. We understand that these mitigation measures can be circumvented once we release the model weights or if users train their own instruction-following models. However, by installing these mitigations, we hope to advance the best practices and ultimately develop community norms for the responsible deployment of foundation models.
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
The `alpaca` data is generated by a language model (`text-davinci-003`) and inevitably contains some errors or biases. We encourage users to use this data with caution and propose new methods to filter or improve the imperfections.
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
The dataset is available under the [Creative Commons NonCommercial (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/legalcode).
### Citation Information
```
@misc{alpaca,
author = {Rohan Taori and Ishaan Gulrajani and Tianyi Zhang and Yann Dubois and Xuechen Li and Carlos Guestrin and Percy Liang and Tatsunori B. Hashimoto },
title = {Stanford Alpaca: An Instruction-following LLaMA model},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/tatsu-lab/stanford_alpaca}},
}
```
### Contributions
[More Information Needed] | abhishekbisaria/Truth | [
"task_categories:text-generation",
"language:en",
"license:cc-by-4.0",
"instruction-finetuning",
"region:us"
] | 2024-02-11T16:18:17+00:00 | {"language": ["en"], "license": "cc-by-4.0", "task_categories": ["text-generation"], "pretty_name": "Alpaca-Cleaned", "tags": ["instruction-finetuning"]} | 2024-02-15T15:30:21+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #language-English #license-cc-by-4.0 #instruction-finetuning #region-us
| Dataset Card for Alpaca-Cleaned
===============================
* Repository: URL
Dataset Description
-------------------
This is a cleaned version of the original Alpaca Dataset released by Stanford. The following issues have been identified in the original release and fixed in this dataset:
1. Hallucinations: Many instructions in the original dataset referenced data on the internet, which just caused GPT-3 to hallucinate an answer.
2. Merged Instructions: There were many instructions that were merged together in the original dataset for some reason.
3. Empty outputs: Some entries in the original dataset had empty outputs.
4. Empty code examples: Some descriptions in the original dataset were missing code examples, making it difficult to understand the intended behavior of the code.
5. Instructions to generate images: Some descriptions in the original dataset included instructions to generate images, something obviously not possible.
6. N/A outputs: Some code snippets in the original dataset had N/A outputs.
7. Inconsistent input field: The original dataset had inconsistent usage of the input field when it was supposed to be empty.
8. Wrong answers: Some instructions/questions in the original dataset had incorrect answers. About 80% of the math problems are estimated to have incorrect answers.
9. Non-Sensical/Unclear instructions: Many instructions are unclear; we try to clarify (or re-write) instructions that are non-sensical. Instructions that are slightly unclear, but where one could deduce the meaning, are not altered.
10. Extraneous escape and control characters: The original dataset had several entries with extraneous escape and control characters.
### Original Alpaca Dataset Summary
Alpaca is a dataset of 52,000 instructions and demonstrations generated by OpenAI's 'text-davinci-003' engine. This instruction data can be used to conduct instruction-tuning for language models and make the language model follow instructions better.
The authors built on the data generation pipeline from Self-Instruct framework and made the following modifications:
* The 'text-davinci-003' engine to generate the instruction data instead of 'davinci'.
* A new prompt was written that explicitly gave the requirement of instruction generation to 'text-davinci-003'.
* Much more aggressive batch decoding was used, i.e., generating 20 instructions at once, which significantly reduced the cost of data generation.
* The data generation pipeline was simplified by discarding the difference between classification and non-classification instructions.
* Only a single instance was generated for each instruction, instead of 2 to 3 instances as in Self-Instruct.
This produced an instruction-following dataset with 52K examples obtained at a much lower cost (less than $500).
In a preliminary study, the authors also found the 52K generated data to be much more diverse than the data released by Self-Instruct.
### Supported Tasks and Leaderboards
The Alpaca dataset is designed for instruction training of pretrained language models.
### Languages
The data in Alpaca are in English (BCP-47 en).
Dataset Structure
-----------------
### Data Instances
An example of "train" looks as follows:
### Data Fields
The data fields are as follows:
* 'instruction': describes the task the model should perform. Each of the 52K instructions is unique.
* 'input': optional context or input for the task. For example, when the instruction is "Summarize the following article", the input is the article. Around 40% of the examples have an input.
* 'output': the answer to the instruction as generated by 'text-davinci-003'.
* 'text': the 'instruction', 'input' and 'output' formatted with the prompt template used by the authors for fine-tuning their models.
### Data Splits
Dataset Creation
----------------
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
Considerations for Using the Data
---------------------------------
### Social Impact of Dataset
Excerpt from the blog post accompanying the release of this dataset:
>
> We believe that releasing the above assets will enable the academic community to perform controlled scientific studies on instruction-following language models, resulting in better science and ultimately new techniques to address the existing deficiencies with these models. At the same time, any release carries some risk. First, we recognize that releasing our training recipe reveals the feasibility of certain capabilities. On one hand, this enables more people (including bad actors) to create models that could cause harm (either intentionally or not). On the other hand, this awareness might incentivize swift defensive action, especially from the academic community, now empowered by the means to perform deeper safety research on such models. Overall, we believe that the benefits for the research community outweigh the risks of this particular release. Given that we are releasing the training recipe, we believe that releasing the data, model weights, and training code incur minimal further risk, given the simplicity of the recipe. At the same time, releasing these assets has enormous benefits for reproducible science, so that the academic community can use standard datasets, models, and code to perform controlled comparisons and to explore extensions. Deploying an interactive demo for Alpaca also poses potential risks, such as more widely disseminating harmful content and lowering the barrier for spam, fraud, or disinformation. We have put into place two risk mitigation strategies. First, we have implemented a content filter using OpenAI’s content moderation API, which filters out harmful content as defined by OpenAI’s usage policies. Second, we watermark all the model outputs using the method described in Kirchenbauer et al. 2023, so that others can detect (with some probability) whether an output comes from Alpaca 7B. Finally, we have strict terms and conditions for using the demo; it is restricted to non-commercial uses and to uses that follow LLaMA’s license agreement. We understand that these mitigation measures can be circumvented once we release the model weights or if users train their own instruction-following models. However, by installing these mitigations, we hope to advance the best practices and ultimately develop community norms for the responsible deployment of foundation models.
>
>
>
### Discussion of Biases
### Other Known Limitations
The 'alpaca' data is generated by a language model ('text-davinci-003') and inevitably contains some errors or biases. We encourage users to use this data with caution and propose new methods to filter or improve the imperfections.
Additional Information
----------------------
### Dataset Curators
### Licensing Information
The dataset is available under the Creative Commons NonCommercial (CC BY-NC 4.0).
### Contributions
| [
"### Original Alpaca Dataset Summary\n\n\nAlpaca is a dataset of 52,000 instructions and demonstrations generated by OpenAI's 'text-davinci-003' engine. This instruction data can be used to conduct instruction-tuning for language models and make the language model follow instruction better.\n\n\nThe authors built on the data generation pipeline from Self-Instruct framework and made the following modifications:\n\n\n* The 'text-davinci-003' engine to generate the instruction data instead of 'davinci'.\n* A new prompt was written that explicitly gave the requirement of instruction generation to 'text-davinci-003'.\n* Much more aggressive batch decoding was used, i.e., generating 20 instructions at once, which significantly reduced the cost of data generation.\n* The data generation pipeline was simplified by discarding the difference between classification and non-classification instructions.\n* Only a single instance was generated for each instruction, instead of 2 to 3 instances as in Self-Instruct.\n\n\nThis produced an instruction-following dataset with 52K examples obtained at a much lower cost (less than $500).\nIn a preliminary study, the authors also found that the 52K generated data to be much more diverse than the data released by Self-Instruct.",
"### Supported Tasks and Leaderboards\n\n\nThe Alpaca dataset designed for instruction training pretrained language models.",
"### Languages\n\n\nThe data in Alpaca are in English (BCP-47 en).\n\n\nDataset Structure\n-----------------",
"### Data Instances\n\n\nAn example of \"train\" looks as follows:",
"### Data Fields\n\n\nThe data fields are as follows:\n\n\n* 'instruction': describes the task the model should perform. Each of the 52K instructions is unique.\n* 'input': optional context or input for the task. For example, when the instruction is \"Summarize the following article\", the input is the article. Around 40% of the examples have an input.\n* 'output': the answer to the instruction as generated by 'text-davinci-003'.\n* 'text': the 'instruction', 'input' and 'output' formatted with the prompt template used by the authors for fine-tuning their models.",
"### Data Splits\n\n\n\nDataset Creation\n----------------",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information\n\n\nConsiderations for Using the Data\n---------------------------------",
"### Social Impact of Dataset\n\n\nExcerpt the blog post accompanying the release of this dataset:\n\n\n\n> \n> We believe that releasing the above assets will enable the academic community to perform controlled scientific studies on instruction-following language models, resulting in better science and ultimately new techniques to address the existing deficiencies with these models. At the same time, any release carries some risk. First, we recognize that releasing our training recipe reveals the feasibility of certain capabilities. On one hand, this enables more people (including bad actors) to create models that could cause harm (either intentionally or not). On the other hand, this awareness might incentivize swift defensive action, especially from the academic community, now empowered by the means to perform deeper safety research on such models. Overall, we believe that the benefits for the research community outweigh the risks of this particular release. Given that we are releasing the training recipe, we believe that releasing the data, model weights, and training code incur minimal further risk, given the simplicity of the recipe. At the same time, releasing these assets has enormous benefits for reproducible science, so that the academic community can use standard datasets, models, and code to perform controlled comparisons and to explore extensions. Deploying an interactive demo for Alpaca also poses potential risks, such as more widely disseminating harmful content and lowering the barrier for spam, fraud, or disinformation. We have put into place two risk mitigation strategies. First, we have implemented a content filter using OpenAI’s content moderation API, which filters out harmful content as defined by OpenAI’s usage policies. Second, we watermark all the model outputs using the method described in Kirchenbauer et al. 2023, so that others can detect (with some probability) whether an output comes from Alpaca 7B. Finally, we have strict terms and conditions for using the demo; it is restricted to non-commercial uses and to uses that follow LLaMA’s license agreement. We understand that these mitigation measures can be circumvented once we release the model weights or if users train their own instruction-following models. However, by installing these mitigations, we hope to advance the best practices and ultimately develop community norms for the responsible deployment of foundation models.\n> \n> \n>",
"### Discussion of Biases",
"### Other Known Limitations\n\n\nThe 'alpaca' data is generated by a language model ('text-davinci-003') and inevitably contains some errors or biases. We encourage users to use this data with caution and propose new methods to filter or improve the imperfections.\n\n\nAdditional Information\n----------------------",
"### Dataset Curators",
"### Licensing Information\n\n\nThe dataset is available under the Creative Commons NonCommercial (CC BY-NC 4.0).",
"### Contributions"
] | [
"TAGS\n#task_categories-text-generation #language-English #license-cc-by-4.0 #instruction-finetuning #region-us \n",
"### Original Alpaca Dataset Summary\n\n\nAlpaca is a dataset of 52,000 instructions and demonstrations generated by OpenAI's 'text-davinci-003' engine. This instruction data can be used to conduct instruction-tuning for language models and make the language model follow instruction better.\n\n\nThe authors built on the data generation pipeline from Self-Instruct framework and made the following modifications:\n\n\n* The 'text-davinci-003' engine to generate the instruction data instead of 'davinci'.\n* A new prompt was written that explicitly gave the requirement of instruction generation to 'text-davinci-003'.\n* Much more aggressive batch decoding was used, i.e., generating 20 instructions at once, which significantly reduced the cost of data generation.\n* The data generation pipeline was simplified by discarding the difference between classification and non-classification instructions.\n* Only a single instance was generated for each instruction, instead of 2 to 3 instances as in Self-Instruct.\n\n\nThis produced an instruction-following dataset with 52K examples obtained at a much lower cost (less than $500).\nIn a preliminary study, the authors also found that the 52K generated data to be much more diverse than the data released by Self-Instruct.",
"### Supported Tasks and Leaderboards\n\n\nThe Alpaca dataset designed for instruction training pretrained language models.",
"### Languages\n\n\nThe data in Alpaca are in English (BCP-47 en).\n\n\nDataset Structure\n-----------------",
"### Data Instances\n\n\nAn example of \"train\" looks as follows:",
"### Data Fields\n\n\nThe data fields are as follows:\n\n\n* 'instruction': describes the task the model should perform. Each of the 52K instructions is unique.\n* 'input': optional context or input for the task. For example, when the instruction is \"Summarize the following article\", the input is the article. Around 40% of the examples have an input.\n* 'output': the answer to the instruction as generated by 'text-davinci-003'.\n* 'text': the 'instruction', 'input' and 'output' formatted with the prompt template used by the authors for fine-tuning their models.",
"### Data Splits\n\n\n\nDataset Creation\n----------------",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information\n\n\nConsiderations for Using the Data\n---------------------------------",
"### Social Impact of Dataset\n\n\nExcerpt the blog post accompanying the release of this dataset:\n\n\n\n> \n> We believe that releasing the above assets will enable the academic community to perform controlled scientific studies on instruction-following language models, resulting in better science and ultimately new techniques to address the existing deficiencies with these models. At the same time, any release carries some risk. First, we recognize that releasing our training recipe reveals the feasibility of certain capabilities. On one hand, this enables more people (including bad actors) to create models that could cause harm (either intentionally or not). On the other hand, this awareness might incentivize swift defensive action, especially from the academic community, now empowered by the means to perform deeper safety research on such models. Overall, we believe that the benefits for the research community outweigh the risks of this particular release. Given that we are releasing the training recipe, we believe that releasing the data, model weights, and training code incur minimal further risk, given the simplicity of the recipe. At the same time, releasing these assets has enormous benefits for reproducible science, so that the academic community can use standard datasets, models, and code to perform controlled comparisons and to explore extensions. Deploying an interactive demo for Alpaca also poses potential risks, such as more widely disseminating harmful content and lowering the barrier for spam, fraud, or disinformation. We have put into place two risk mitigation strategies. First, we have implemented a content filter using OpenAI’s content moderation API, which filters out harmful content as defined by OpenAI’s usage policies. Second, we watermark all the model outputs using the method described in Kirchenbauer et al. 2023, so that others can detect (with some probability) whether an output comes from Alpaca 7B. Finally, we have strict terms and conditions for using the demo; it is restricted to non-commercial uses and to uses that follow LLaMA’s license agreement. We understand that these mitigation measures can be circumvented once we release the model weights or if users train their own instruction-following models. However, by installing these mitigations, we hope to advance the best practices and ultimately develop community norms for the responsible deployment of foundation models.\n> \n> \n>",
"### Discussion of Biases",
"### Other Known Limitations\n\n\nThe 'alpaca' data is generated by a language model ('text-davinci-003') and inevitably contains some errors or biases. We encourage users to use this data with caution and propose new methods to filter or improve the imperfections.\n\n\nAdditional Information\n----------------------",
"### Dataset Curators",
"### Licensing Information\n\n\nThe dataset is available under the Creative Commons NonCommercial (CC BY-NC 4.0).",
"### Contributions"
] |
9547bfea460f70aa7788ab829a36af3ad83b9adb |
# Dataset Card for Evaluation run of JaeyeonKang/CCK_Asura_v2.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [JaeyeonKang/CCK_Asura_v2.1](https://huggingface.co/JaeyeonKang/CCK_Asura_v2.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-11T16:16:03.001484](https://huggingface.co/datasets/open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2.1/blob/main/results_2024-02-11T16-16-03.001484.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7485227705373686,
"acc_stderr": 0.028690850662240704,
"acc_norm": 0.7515411094238932,
"acc_norm_stderr": 0.02924740008798966,
"mc1": 0.5152998776009792,
"mc1_stderr": 0.0174953044731879,
"mc2": 0.6733433696722777,
"mc2_stderr": 0.014930500543970958
},
"harness|arc:challenge|25": {
"acc": 0.6766211604095563,
"acc_stderr": 0.013669421630012127,
"acc_norm": 0.7252559726962458,
"acc_norm_stderr": 0.013044617212771227
},
"harness|hellaswag|10": {
"acc": 0.7064329814777933,
"acc_stderr": 0.004544651976040094,
"acc_norm": 0.8874726150169289,
"acc_norm_stderr": 0.0031536835304090366
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.03999262876617721,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.03999262876617721
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8486842105263158,
"acc_stderr": 0.029162631596843982,
"acc_norm": 0.8486842105263158,
"acc_norm_stderr": 0.029162631596843982
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7811320754716982,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.7811320754716982,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.02554523921025691,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.02554523921025691
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.032424147574830975,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.032424147574830975
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7276595744680852,
"acc_stderr": 0.029101290698386715,
"acc_norm": 0.7276595744680852,
"acc_norm_stderr": 0.029101290698386715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7379310344827587,
"acc_stderr": 0.03664666337225257,
"acc_norm": 0.7379310344827587,
"acc_norm_stderr": 0.03664666337225257
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5529100529100529,
"acc_stderr": 0.025606723995777025,
"acc_norm": 0.5529100529100529,
"acc_norm_stderr": 0.025606723995777025
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8838709677419355,
"acc_stderr": 0.018225757949432302,
"acc_norm": 0.8838709677419355,
"acc_norm_stderr": 0.018225757949432302
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6650246305418719,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.6650246305418719,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9141414141414141,
"acc_stderr": 0.01996022556317289,
"acc_norm": 0.9141414141414141,
"acc_norm_stderr": 0.01996022556317289
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9481865284974094,
"acc_stderr": 0.01599622932024412,
"acc_norm": 0.9481865284974094,
"acc_norm_stderr": 0.01599622932024412
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7897435897435897,
"acc_stderr": 0.020660597485026935,
"acc_norm": 0.7897435897435897,
"acc_norm_stderr": 0.020660597485026935
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.03038416923235082,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.03038416923235082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8613445378151261,
"acc_stderr": 0.02244826447683258,
"acc_norm": 0.8613445378151261,
"acc_norm_stderr": 0.02244826447683258
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9174311926605505,
"acc_stderr": 0.011800361363016569,
"acc_norm": 0.9174311926605505,
"acc_norm_stderr": 0.011800361363016569
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7175925925925926,
"acc_stderr": 0.030701372111510934,
"acc_norm": 0.7175925925925926,
"acc_norm_stderr": 0.030701372111510934
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.01831885585008968,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.01831885585008968
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.01926932302564028,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.01926932302564028
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.02624113299640726,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.02624113299640726
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.03088466108951538,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.03088466108951538
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9256198347107438,
"acc_stderr": 0.02395268883667674,
"acc_norm": 0.9256198347107438,
"acc_norm_stderr": 0.02395268883667674
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243631,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243631
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.02963471727237103,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.02963471727237103
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6071428571428571,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.6071428571428571,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808629,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808629
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.01789378490401853,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.01789378490401853
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8914431673052363,
"acc_stderr": 0.011124283175851181,
"acc_norm": 0.8914431673052363,
"acc_norm_stderr": 0.011124283175851181
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8294797687861272,
"acc_stderr": 0.020247961569303728,
"acc_norm": 0.8294797687861272,
"acc_norm_stderr": 0.020247961569303728
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6491620111731844,
"acc_stderr": 0.015961036675230973,
"acc_norm": 0.6491620111731844,
"acc_norm_stderr": 0.015961036675230973
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.022140767512880973,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.022140767512880973
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8231511254019293,
"acc_stderr": 0.0216700588855108,
"acc_norm": 0.8231511254019293,
"acc_norm_stderr": 0.0216700588855108
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8395061728395061,
"acc_stderr": 0.020423955354778034,
"acc_norm": 0.8395061728395061,
"acc_norm_stderr": 0.020423955354778034
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5780141843971631,
"acc_stderr": 0.0294621892333706,
"acc_norm": 0.5780141843971631,
"acc_norm_stderr": 0.0294621892333706
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5795306388526728,
"acc_stderr": 0.012607654553832703,
"acc_norm": 0.5795306388526728,
"acc_norm_stderr": 0.012607654553832703
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8125,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.815359477124183,
"acc_stderr": 0.01569702924075778,
"acc_norm": 0.815359477124183,
"acc_norm_stderr": 0.01569702924075778
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8326530612244898,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.8326530612244898,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9154228855721394,
"acc_stderr": 0.019675343217199177,
"acc_norm": 0.9154228855721394,
"acc_norm_stderr": 0.019675343217199177
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.02386832565759418,
"acc_norm": 0.94,
"acc_norm_stderr": 0.02386832565759418
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5152998776009792,
"mc1_stderr": 0.0174953044731879,
"mc2": 0.6733433696722777,
"mc2_stderr": 0.014930500543970958
},
"harness|winogrande|5": {
"acc": 0.8587213891081295,
"acc_stderr": 0.009789206625044573
},
"harness|gsm8k|5": {
"acc": 0.6899166034874905,
"acc_stderr": 0.012740305717376268
}
}
```
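
As a minimal sketch, the aggregated metrics above can also be pulled programmatically through the "results" configuration (the "latest" split, declared in this repository's metadata, points at the most recent run):

```python
from datasets import load_dataset

# Aggregated metrics for this evaluation run of JaeyeonKang/CCK_Asura_v2.1;
# the "latest" split always tracks the most recent results file.
results = load_dataset("open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2.1",
	"results",
	split="latest")
```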
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2.1 | [
"region:us"
] | 2024-02-11T16:18:28+00:00 | {"pretty_name": "Evaluation run of JaeyeonKang/CCK_Asura_v2.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [JaeyeonKang/CCK_Asura_v2.1](https://huggingface.co/JaeyeonKang/CCK_Asura_v2.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T16:16:03.001484](https://huggingface.co/datasets/open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2.1/blob/main/results_2024-02-11T16-16-03.001484.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7485227705373686,\n \"acc_stderr\": 0.028690850662240704,\n \"acc_norm\": 0.7515411094238932,\n \"acc_norm_stderr\": 0.02924740008798966,\n \"mc1\": 0.5152998776009792,\n \"mc1_stderr\": 0.0174953044731879,\n \"mc2\": 0.6733433696722777,\n \"mc2_stderr\": 0.014930500543970958\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6766211604095563,\n \"acc_stderr\": 0.013669421630012127,\n \"acc_norm\": 0.7252559726962458,\n \"acc_norm_stderr\": 0.013044617212771227\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7064329814777933,\n \"acc_stderr\": 0.004544651976040094,\n \"acc_norm\": 0.8874726150169289,\n \"acc_norm_stderr\": 0.0031536835304090366\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n \"acc_stderr\": 0.03999262876617721,\n \"acc_norm\": 0.6888888888888889,\n \"acc_norm_stderr\": 0.03999262876617721\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8486842105263158,\n \"acc_stderr\": 0.029162631596843982,\n \"acc_norm\": 0.8486842105263158,\n \"acc_norm_stderr\": 0.029162631596843982\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7811320754716982,\n \"acc_stderr\": 0.02544786382510861,\n \"acc_norm\": 0.7811320754716982,\n \"acc_norm_stderr\": 0.02544786382510861\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.02554523921025691,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.02554523921025691\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 
0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.032424147574830975,\n \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.032424147574830975\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.049665709039785295,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.049665709039785295\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7276595744680852,\n \"acc_stderr\": 0.029101290698386715,\n \"acc_norm\": 0.7276595744680852,\n \"acc_norm_stderr\": 0.029101290698386715\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7379310344827587,\n \"acc_stderr\": 0.03664666337225257,\n \"acc_norm\": 0.7379310344827587,\n \"acc_norm_stderr\": 0.03664666337225257\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5529100529100529,\n \"acc_stderr\": 0.025606723995777025,\n \"acc_norm\": 0.5529100529100529,\n \"acc_norm_stderr\": 0.025606723995777025\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8838709677419355,\n \"acc_stderr\": 0.018225757949432302,\n \"acc_norm\": 0.8838709677419355,\n \"acc_norm_stderr\": 0.018225757949432302\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6650246305418719,\n \"acc_stderr\": 0.033208527423483104,\n \"acc_norm\": 0.6650246305418719,\n \"acc_norm_stderr\": 0.033208527423483104\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9141414141414141,\n \"acc_stderr\": 0.01996022556317289,\n \"acc_norm\": 0.9141414141414141,\n \"acc_norm_stderr\": 0.01996022556317289\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9481865284974094,\n \"acc_stderr\": 0.01599622932024412,\n \"acc_norm\": 0.9481865284974094,\n \"acc_norm_stderr\": 0.01599622932024412\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7897435897435897,\n \"acc_stderr\": 0.020660597485026935,\n \"acc_norm\": 0.7897435897435897,\n \"acc_norm_stderr\": 0.020660597485026935\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.03038416923235082,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.03038416923235082\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8613445378151261,\n \"acc_stderr\": 0.02244826447683258,\n \"acc_norm\": 0.8613445378151261,\n \"acc_norm_stderr\": 0.02244826447683258\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248436,\n \"acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248436\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9174311926605505,\n \"acc_stderr\": 0.011800361363016569,\n \"acc_norm\": 0.9174311926605505,\n \"acc_norm_stderr\": 0.011800361363016569\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.7175925925925926,\n \"acc_stderr\": 0.030701372111510934,\n \"acc_norm\": 0.7175925925925926,\n \"acc_norm_stderr\": 0.030701372111510934\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.01831885585008968,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.01831885585008968\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9029535864978903,\n \"acc_stderr\": 0.01926932302564028,\n \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.01926932302564028\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n \"acc_stderr\": 0.02624113299640726,\n \"acc_norm\": 0.8116591928251121,\n \"acc_norm_stderr\": 0.02624113299640726\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.03088466108951538,\n \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.03088466108951538\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9256198347107438,\n \"acc_stderr\": 0.02395268883667674,\n \"acc_norm\": 0.9256198347107438,\n \"acc_norm_stderr\": 0.02395268883667674\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.03434300243631,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.03434300243631\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.02963471727237103,\n \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.02963471727237103\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6071428571428571,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.6071428571428571,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808629,\n \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808629\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n \"acc_stderr\": 0.01789378490401853,\n \"acc_norm\": 0.9188034188034188,\n \"acc_norm_stderr\": 0.01789378490401853\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8914431673052363,\n \"acc_stderr\": 0.011124283175851181,\n \"acc_norm\": 0.8914431673052363,\n \"acc_norm_stderr\": 0.011124283175851181\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8294797687861272,\n \"acc_stderr\": 0.020247961569303728,\n \"acc_norm\": 0.8294797687861272,\n \"acc_norm_stderr\": 0.020247961569303728\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6491620111731844,\n \"acc_stderr\": 0.015961036675230973,\n \"acc_norm\": 0.6491620111731844,\n \"acc_norm_stderr\": 0.015961036675230973\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.022140767512880973,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.022140767512880973\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8231511254019293,\n \"acc_stderr\": 0.0216700588855108,\n \"acc_norm\": 0.8231511254019293,\n \"acc_norm_stderr\": 0.0216700588855108\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8395061728395061,\n \"acc_stderr\": 0.020423955354778034,\n \"acc_norm\": 0.8395061728395061,\n \"acc_norm_stderr\": 0.020423955354778034\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5780141843971631,\n \"acc_stderr\": 0.0294621892333706,\n \"acc_norm\": 0.5780141843971631,\n \"acc_norm_stderr\": 0.0294621892333706\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5795306388526728,\n \"acc_stderr\": 0.012607654553832703,\n \"acc_norm\": 0.5795306388526728,\n \"acc_norm_stderr\": 0.012607654553832703\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.815359477124183,\n \"acc_stderr\": 0.01569702924075778,\n \"acc_norm\": 0.815359477124183,\n \"acc_norm_stderr\": 0.01569702924075778\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.02389714476891452,\n \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.02389714476891452\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9154228855721394,\n \"acc_stderr\": 0.019675343217199177,\n \"acc_norm\": 0.9154228855721394,\n \"acc_norm_stderr\": 0.019675343217199177\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.94,\n \"acc_stderr\": 0.02386832565759418,\n \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.02386832565759418\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5152998776009792,\n \"mc1_stderr\": 0.0174953044731879,\n \"mc2\": 0.6733433696722777,\n \"mc2_stderr\": 0.014930500543970958\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8587213891081295,\n \"acc_stderr\": 0.009789206625044573\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6899166034874905,\n \"acc_stderr\": 0.012740305717376268\n }\n}\n```", "repo_url": 
"https://huggingface.co/JaeyeonKang/CCK_Asura_v2.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|arc:challenge|25_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|gsm8k|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hellaswag|10_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-16-03.001484.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-16-03.001484.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-16-03.001484.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T16-16-03.001484.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-16-03.001484.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-16-03.001484.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["**/details_harness|winogrande|5_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T16-16-03.001484.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_11T16_16_03.001484", "path": ["results_2024-02-11T16-16-03.001484.parquet"]}, {"split": "latest", "path": 
["results_2024-02-11T16-16-03.001484.parquet"]}]}]} | 2024-02-11T16:18:49+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of JaeyeonKang/CCK_Asura_v2.1
Dataset automatically created during the evaluation run of model JaeyeonKang/CCK_Asura_v2.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-11T16:16:03.001484 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of JaeyeonKang/CCK_Asura_v2.1\n\n\n\nDataset automatically created during the evaluation run of model JaeyeonKang/CCK_Asura_v2.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T16:16:03.001484(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of JaeyeonKang/CCK_Asura_v2.1\n\n\n\nDataset automatically created during the evaluation run of model JaeyeonKang/CCK_Asura_v2.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T16:16:03.001484(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
913ca2af23c1874b6f9f2e5fa758a10373502a8e |
# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-7B-0210-dare
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [louisbrulenaudet/Pearl-7B-0210-dare](https://huggingface.co/louisbrulenaudet/Pearl-7B-0210-dare) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-dare",
"harness_winogrande_5",
split="train")
```
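
The aggregated metrics can be loaded the same way through the dedicated "results" configuration. A minimal sketch, assuming the config and split names follow the metadata above:

```python
from datasets import load_dataset

# Aggregated metrics for the latest run of this model
results = load_dataset(
    "open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-dare",
    "results",
    split="latest",
)
# Inspect the first (and typically only) row of aggregated metrics
print(results[0])
```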
## Latest results
These are the [latest results from run 2024-02-11T16:34:01.841503](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-dare/blob/main/results_2024-02-11T16-34-01.841503.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6238865006281977,
"acc_stderr": 0.03273771754882751,
"acc_norm": 0.6230212223075242,
"acc_norm_stderr": 0.03342419316334821,
"mc1": 0.5752753977968176,
"mc1_stderr": 0.017304000957167477,
"mc2": 0.7146460351141442,
"mc2_stderr": 0.015047159780719415
},
"harness|arc:challenge|25": {
"acc": 0.6902730375426621,
"acc_stderr": 0.013512058415238361,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.013273077865907597
},
"harness|hellaswag|10": {
"acc": 0.7264489145588529,
"acc_stderr": 0.004448701611795089,
"acc_norm": 0.8879705238000398,
"acc_norm_stderr": 0.003147581209374547
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.032500536843658404,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.032500536843658404
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.025167982333894143,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.025167982333894143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.02564938106302926,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.02564938106302926
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593566,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593566
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062157,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062157
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.016722684526200154,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.016722684526200154
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243741,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243741
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516301,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516301
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.03512385283705048,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.03512385283705048
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7969348659003831,
"acc_stderr": 0.014385525076611578,
"acc_norm": 0.7969348659003831,
"acc_norm_stderr": 0.014385525076611578
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3564245810055866,
"acc_stderr": 0.016018239710513398,
"acc_norm": 0.3564245810055866,
"acc_norm_stderr": 0.016018239710513398
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.026992544339297243,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.026992544339297243
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.025842248700902164,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.025842248700902164
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.012687818419599923,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.012687818419599923
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.01959402113657744,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.01959402113657744
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.02950489645459596,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.02950489645459596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5752753977968176,
"mc1_stderr": 0.017304000957167477,
"mc2": 0.7146460351141442,
"mc2_stderr": 0.015047159780719415
},
"harness|winogrande|5": {
"acc": 0.8453038674033149,
"acc_stderr": 0.010163172650433542
},
"harness|gsm8k|5": {
"acc": 0.6338134950720242,
"acc_stderr": 0.013270100238748831
}
}
```
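
Each per-task block above shares the same structure (`acc`, `acc_norm`, and their standard errors), so the values can be re-aggregated directly from the parsed JSON. A minimal sketch that averages `acc_norm` over the MMLU (hendrycksTest) sub-tasks; only two entries are reproduced here for illustration:

```python
# Hypothetical subset of the results dictionary shown above;
# in practice you would json.loads() the full results file.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.5851851851851851},
}

# Collect acc_norm for every hendrycksTest sub-task and average them
mmlu = [
    task["acc_norm"]
    for name, task in results.items()
    if name.startswith("harness|hendrycksTest-")
]
print(f"Average acc_norm over {len(mmlu)} hendrycksTest sub-tasks: {sum(mmlu) / len(mmlu):.4f}")
```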
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-dare | [
"region:us"
] | 2024-02-11T16:36:23+00:00 | {"pretty_name": "Evaluation run of louisbrulenaudet/Pearl-7B-0210-dare", "dataset_summary": "Dataset automatically created during the evaluation run of model [louisbrulenaudet/Pearl-7B-0210-dare](https://huggingface.co/louisbrulenaudet/Pearl-7B-0210-dare) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-dare\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T16:34:01.841503](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-dare/blob/main/results_2024-02-11T16-34-01.841503.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6238865006281977,\n \"acc_stderr\": 0.03273771754882751,\n \"acc_norm\": 0.6230212223075242,\n \"acc_norm_stderr\": 0.03342419316334821,\n \"mc1\": 0.5752753977968176,\n \"mc1_stderr\": 0.017304000957167477,\n \"mc2\": 0.7146460351141442,\n \"mc2_stderr\": 0.015047159780719415\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6902730375426621,\n \"acc_stderr\": 0.013512058415238361,\n \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.013273077865907597\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7264489145588529,\n \"acc_stderr\": 0.004448701611795089,\n \"acc_norm\": 0.8879705238000398,\n \"acc_norm_stderr\": 0.003147581209374547\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.032500536843658404,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.032500536843658404\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7161290322580646,\n \"acc_stderr\": 0.02564938106302926,\n \"acc_norm\": 0.7161290322580646,\n \"acc_norm_stderr\": 0.02564938106302926\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593566,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593566\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200154,\n \"acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200154\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243741,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243741\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516301,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516301\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.03512385283705048,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.03512385283705048\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7969348659003831,\n \"acc_stderr\": 0.014385525076611578,\n \"acc_norm\": 0.7969348659003831,\n \"acc_norm_stderr\": 0.014385525076611578\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3564245810055866,\n \"acc_stderr\": 0.016018239710513398,\n \"acc_norm\": 0.3564245810055866,\n \"acc_norm_stderr\": 0.016018239710513398\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.026992544339297243,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.026992544339297243\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.025842248700902164,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.025842248700902164\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n \"acc_stderr\": 0.012687818419599923,\n \"acc_norm\": 0.44328552803129073,\n \"acc_norm_stderr\": 0.012687818419599923\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.01959402113657744,\n \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.01959402113657744\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.02950489645459596,\n \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.02950489645459596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5752753977968176,\n \"mc1_stderr\": 0.017304000957167477,\n \"mc2\": 0.7146460351141442,\n \"mc2_stderr\": 0.015047159780719415\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8453038674033149,\n \"acc_stderr\": 0.010163172650433542\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6338134950720242,\n \"acc_stderr\": 0.013270100238748831\n 
}\n}\n```", "repo_url": "https://huggingface.co/louisbrulenaudet/Pearl-7B-0210-dare", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|arc:challenge|25_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|gsm8k|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hellaswag|10_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-34-01.841503.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-34-01.841503.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-34-01.841503.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T16-34-01.841503.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-34-01.841503.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T16_34_01.841503", "path": ["**/details_harness|winogrande|5_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T16-34-01.841503.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_11T16_34_01.841503", "path": ["results_2024-02-11T16-34-01.841503.parquet"]}, {"split": "latest", "path": ["results_2024-02-11T16-34-01.841503.parquet"]}]}]} | 2024-02-11T16:36:44+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-7B-0210-dare
Dataset automatically created during the evaluation run of model louisbrulenaudet/Pearl-7B-0210-dare on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
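A minimal sketch of the loading call (the repository name `open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-dare` is assumed here from the leaderboard's standard details-repo naming convention):

```python
from datasets import load_dataset

# Load the per-sample details of one task configuration for this run
data = load_dataset("open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-dare",
	"harness_winogrande_5",
	split="train")
```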
## Latest results
These are the latest results from run 2024-02-11T16:34:01.841503 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-7B-0210-dare\n\n\n\nDataset automatically created during the evaluation run of model louisbrulenaudet/Pearl-7B-0210-dare on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T16:34:01.841503(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-7B-0210-dare\n\n\n\nDataset automatically created during the evaluation run of model louisbrulenaudet/Pearl-7B-0210-dare on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T16:34:01.841503(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
b10df6557c8471a8877da4325edaf33ac182ac3a | # Dataset Card for "dropoff-utcustom-EVAL"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | sam1120/dropoff-utcustom-EVAL | [
"region:us"
] | 2024-02-11T16:56:45+00:00 | {"dataset_info": {"features": [{"name": "name", "dtype": "string"}, {"name": "pixel_values", "dtype": "image"}, {"name": "labels", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 139241854.0, "num_examples": 50}], "download_size": 40271598, "dataset_size": 139241854.0}} | 2024-02-11T17:04:35+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dropoff-utcustom-EVAL"
More Information needed | [
"# Dataset Card for \"dropoff-utcustom-EVAL\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"dropoff-utcustom-EVAL\"\n\nMore Information needed"
] |
22ec36171c8b11ca16e1ea5360635766b79c47de | # Dataset Card for "dropoff-utcustom-TRAIN"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | sam1120/dropoff-utcustom-TRAIN | [
"region:us"
] | 2024-02-11T16:57:16+00:00 | {"dataset_info": {"features": [{"name": "name", "dtype": "string"}, {"name": "pixel_values", "dtype": "image"}, {"name": "labels", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 142272068.0, "num_examples": 50}], "download_size": 43507500, "dataset_size": 142272068.0}} | 2024-02-11T17:01:14+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dropoff-utcustom-TRAIN"
More Information needed | [
"# Dataset Card for \"dropoff-utcustom-TRAIN\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"dropoff-utcustom-TRAIN\"\n\nMore Information needed"
] |
a262745b4f7f89c0b90d8bae07bbad8278f800cc | # Dataset Card for "dropoff-utcustom-TEST"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | sam1120/dropoff-utcustom-TEST | [
"region:us"
] | 2024-02-11T16:58:25+00:00 | {"dataset_info": {"features": [{"name": "name", "dtype": "string"}, {"name": "pixel_values", "dtype": "image"}, {"name": "labels", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 286143613.0, "num_examples": 101}], "download_size": 86640225, "dataset_size": 286143613.0}} | 2024-02-11T17:00:58+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dropoff-utcustom-TEST"
More Information needed | [
"# Dataset Card for \"dropoff-utcustom-TEST\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"dropoff-utcustom-TEST\"\n\nMore Information needed"
] |
a77ae3f6dbfa5835d7ee73593ace0dfb536d294e |
# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-7B-0211-ties
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [louisbrulenaudet/Pearl-7B-0211-ties](https://huggingface.co/louisbrulenaudet/Pearl-7B-0211-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0211-ties",
"harness_winogrande_5",
split="train")
```
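The aggregated scores described above can be pulled from the "results" configuration in the same way; a minimal sketch (the "latest" split name is taken from this dataset's configuration metadata, so treat it as an assumption if that metadata changes):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model
results = load_dataset("open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0211-ties",
	"results",
	split="latest")
```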
## Latest results
These are the [latest results from run 2024-02-11T17:47:26.168272](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0211-ties/blob/main/results_2024-02-11T17-47-26.168272.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6456432854764532,
"acc_stderr": 0.03218918772125075,
"acc_norm": 0.644583259854297,
"acc_norm_stderr": 0.03286610473883484,
"mc1": 0.5593635250917993,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.7146037286866871,
"mc2_stderr": 0.01481600451729246
},
"harness|arc:challenge|25": {
"acc": 0.6868600682593856,
"acc_stderr": 0.013552671543623494,
"acc_norm": 0.7141638225255973,
"acc_norm_stderr": 0.013203196088537377
},
"harness|hellaswag|10": {
"acc": 0.7193786098386775,
"acc_stderr": 0.004483845735187828,
"acc_norm": 0.888568014339773,
"acc_norm_stderr": 0.0031402323925687962
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266344,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266344
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033484,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033484
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513537,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513537
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099867,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099867
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.03322015795776741,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.03322015795776741
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42681564245810055,
"acc_stderr": 0.016542401954631917,
"acc_norm": 0.42681564245810055,
"acc_norm_stderr": 0.016542401954631917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.02500646975579921,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.02500646975579921
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873862,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873862
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897229,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897229
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5593635250917993,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.7146037286866871,
"mc2_stderr": 0.01481600451729246
},
"harness|winogrande|5": {
"acc": 0.8437253354380426,
"acc_stderr": 0.010205351791873502
},
"harness|gsm8k|5": {
"acc": 0.7065959059893859,
"acc_stderr": 0.012541830815461492
}
}
```
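Once parsed, the per-task entries above are keyed as `harness|<task>|<n_few_shot>`; a small sketch, assuming the JSON shown has been saved locally as `results.json` (a hypothetical filename used only for illustration):

```python
import json

with open("results.json") as f:
    results = json.load(f)

# Each task maps to its accuracy metrics and standard errors
arc = results["harness|arc:challenge|25"]
print(arc["acc_norm"], arc["acc_norm_stderr"])  # 0.7141..., 0.0132...
```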
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0211-ties | [
"region:us"
] | 2024-02-11T17:49:45+00:00 | {"pretty_name": "Evaluation run of louisbrulenaudet/Pearl-7B-0211-ties", "dataset_summary": "Dataset automatically created during the evaluation run of model [louisbrulenaudet/Pearl-7B-0211-ties](https://huggingface.co/louisbrulenaudet/Pearl-7B-0211-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0211-ties\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T17:47:26.168272](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0211-ties/blob/main/results_2024-02-11T17-47-26.168272.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6456432854764532,\n \"acc_stderr\": 0.03218918772125075,\n \"acc_norm\": 0.644583259854297,\n \"acc_norm_stderr\": 0.03286610473883484,\n \"mc1\": 0.5593635250917993,\n \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.7146037286866871,\n \"mc2_stderr\": 0.01481600451729246\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6868600682593856,\n \"acc_stderr\": 0.013552671543623494,\n \"acc_norm\": 0.7141638225255973,\n \"acc_norm_stderr\": 0.013203196088537377\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7193786098386775,\n \"acc_stderr\": 0.004483845735187828,\n \"acc_norm\": 0.888568014339773,\n \"acc_norm_stderr\": 0.0031402323925687962\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493857,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493857\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266344,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266344\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033484,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033484\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513537,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513537\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099867,\n \"acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099867\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.03322015795776741,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.03322015795776741\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8212005108556832,\n \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42681564245810055,\n \"acc_stderr\": 0.016542401954631917,\n \"acc_norm\": 0.42681564245810055,\n \"acc_norm_stderr\": 0.016542401954631917\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579921,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579921\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873862,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873862\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.012741974333897229,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.012741974333897229\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5593635250917993,\n \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.7146037286866871,\n \"mc2_stderr\": 0.01481600451729246\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8437253354380426,\n \"acc_stderr\": 0.010205351791873502\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7065959059893859,\n \"acc_stderr\": 0.012541830815461492\n 
}\n}\n```", "repo_url": "https://huggingface.co/louisbrulenaudet/Pearl-7B-0211-ties", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|arc:challenge|25_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|gsm8k|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hellaswag|10_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T17-47-26.168272.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T17-47-26.168272.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T17-47-26.168272.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T17-47-26.168272.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T17-47-26.168272.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T17_47_26.168272", "path": ["**/details_harness|winogrande|5_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T17-47-26.168272.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_11T17_47_26.168272", "path": ["results_2024-02-11T17-47-26.168272.parquet"]}, {"split": "latest", "path": ["results_2024-02-11T17-47-26.168272.parquet"]}]}]} | 2024-02-11T17:50:05+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-7B-0211-ties
Dataset automatically created during the evaluation run of model louisbrulenaudet/Pearl-7B-0211-ties on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
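A minimal sketch, assuming the dataset id follows the usual `open-llm-leaderboard/details_<org>__<model>` naming pattern used across this collection (`harness_winogrande_5` is one of the configurations listed in this repository's metadata):

```python
from datasets import load_dataset

# Load the per-sample details for the 5-shot Winogrande task from this evaluation run.
data = load_dataset("open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0211-ties",
                    "harness_winogrande_5",
                    split="train")
```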
## Latest results
These are the latest results from run 2024-02-11T17:47:26.168272 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-7B-0211-ties\n\n\n\nDataset automatically created during the evaluation run of model louisbrulenaudet/Pearl-7B-0211-ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T17:47:26.168272(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-7B-0211-ties\n\n\n\nDataset automatically created during the evaluation run of model louisbrulenaudet/Pearl-7B-0211-ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T17:47:26.168272(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
ebe9b78c3148590db6fb1aef6d9a5bfe3bed2b8e |
# Dataset Card for Evaluation run of RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2](https://huggingface.co/RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RubielLabarta__LogoS-7Bx2-MoE-13B-v0.2",
"harness_winogrande_5",
split="train")
```
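To pull only the aggregated metrics instead of the per-sample details, the "results" configuration mentioned above can be loaded the same way (a small sketch; the `latest` split name follows the convention used by the other configurations in this repository):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run of this model.
results = load_dataset("open-llm-leaderboard/details_RubielLabarta__LogoS-7Bx2-MoE-13B-v0.2",
                       "results",
                       split="latest")
```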
## Latest results
These are the [latest results from run 2024-02-11T17:52:31.585367](https://huggingface.co/datasets/open-llm-leaderboard/details_RubielLabarta__LogoS-7Bx2-MoE-13B-v0.2/blob/main/results_2024-02-11T17-52-31.585367.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6562860847489057,
"acc_stderr": 0.032004990310050704,
"acc_norm": 0.654707691151165,
"acc_norm_stderr": 0.03269649322595469,
"mc1": 0.6193390452876377,
"mc1_stderr": 0.01699762787190791,
"mc2": 0.7452565487832791,
"mc2_stderr": 0.014341967286352852
},
"harness|arc:challenge|25": {
"acc": 0.7141638225255973,
"acc_stderr": 0.013203196088537372,
"acc_norm": 0.7440273037542662,
"acc_norm_stderr": 0.012753013241244521
},
"harness|hellaswag|10": {
"acc": 0.7267476598287194,
"acc_stderr": 0.004447185883327435,
"acc_norm": 0.8908583947420833,
"acc_norm_stderr": 0.003111795320787943
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544064,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544064
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.0255428468174005,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.0255428468174005
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02615686752393104,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02615686752393104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.0133878957315436,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.0133878957315436
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4592178770949721,
"acc_stderr": 0.016666783616525776,
"acc_norm": 0.4592178770949721,
"acc_norm_stderr": 0.016666783616525776
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667878,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.01274085387294983,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.01274085387294983
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6193390452876377,
"mc1_stderr": 0.01699762787190791,
"mc2": 0.7452565487832791,
"mc2_stderr": 0.014341967286352852
},
"harness|winogrande|5": {
"acc": 0.8839779005524862,
"acc_stderr": 0.009000656983537947
},
"harness|gsm8k|5": {
"acc": 0.7156937073540561,
"acc_stderr": 0.012425078188395982
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_RubielLabarta__LogoS-7Bx2-MoE-13B-v0.2 | [
"region:us"
] | 2024-02-11T17:54:47+00:00 | {"pretty_name": "Evaluation run of RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2](https://huggingface.co/RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RubielLabarta__LogoS-7Bx2-MoE-13B-v0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T17:52:31.585367](https://huggingface.co/datasets/open-llm-leaderboard/details_RubielLabarta__LogoS-7Bx2-MoE-13B-v0.2/blob/main/results_2024-02-11T17-52-31.585367.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6562860847489057,\n \"acc_stderr\": 0.032004990310050704,\n \"acc_norm\": 0.654707691151165,\n \"acc_norm_stderr\": 0.03269649322595469,\n \"mc1\": 0.6193390452876377,\n \"mc1_stderr\": 0.01699762787190791,\n \"mc2\": 0.7452565487832791,\n \"mc2_stderr\": 0.014341967286352852\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7141638225255973,\n \"acc_stderr\": 0.013203196088537372,\n \"acc_norm\": 0.7440273037542662,\n \"acc_norm_stderr\": 0.012753013241244521\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7267476598287194,\n \"acc_stderr\": 0.004447185883327435,\n \"acc_norm\": 0.8908583947420833,\n \"acc_norm_stderr\": 0.003111795320787943\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544064,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544064\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.0255428468174005\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047709,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047709\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 0.9067357512953368,\n 
\"acc_norm_stderr\": 0.020986854593289733\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.02615686752393104,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02615686752393104\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8314176245210728,\n \"acc_stderr\": 0.0133878957315436,\n \"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.0133878957315436\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4592178770949721,\n \"acc_stderr\": 0.016666783616525776,\n \"acc_norm\": 0.4592178770949721,\n \"acc_norm_stderr\": 0.016666783616525776\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667878,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667878\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n \"acc_stderr\": 0.01274085387294983,\n \"acc_norm\": 0.4661016949152542,\n \"acc_norm_stderr\": 0.01274085387294983\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6193390452876377,\n \"mc1_stderr\": 0.01699762787190791,\n \"mc2\": 0.7452565487832791,\n \"mc2_stderr\": 0.014341967286352852\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8839779005524862,\n \"acc_stderr\": 0.009000656983537947\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7156937073540561,\n \"acc_stderr\": 0.012425078188395982\n }\n}\n```", 
"repo_url": "https://huggingface.co/RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|arc:challenge|25_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|gsm8k|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hellaswag|10_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T17-52-31.585367.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T17-52-31.585367.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T17-52-31.585367.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T17-52-31.585367.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T17-52-31.585367.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T17_52_31.585367", "path": ["**/details_harness|winogrande|5_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T17-52-31.585367.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_11T17_52_31.585367", "path": ["results_2024-02-11T17-52-31.585367.parquet"]}, {"split": "latest", "path": ["results_2024-02-11T17-52-31.585367.parquet"]}]}]} | 2024-02-11T17:55:10+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2
Dataset automatically created during the evaluation run of model RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-11T17:52:31.585367 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2\n\n\n\nDataset automatically created during the evaluation run of model RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T17:52:31.585367(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2\n\n\n\nDataset automatically created during the evaluation run of model RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T17:52:31.585367(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0dc0c9502275837d602edf082bc9eb774fd5f2db |
# Dataset Card for Evaluation run of Inv/Konstanta-Gamma-10.9B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Inv/Konstanta-Gamma-10.9B](https://huggingface.co/Inv/Konstanta-Gamma-10.9B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Inv__Konstanta-Gamma-10.9B",
"harness_winogrande_5",
split="train")
```
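
If you want to browse what is available before loading anything, the sketch below lists the configurations and splits and then loads the aggregated "results" configuration described above. This is a minimal sketch: it assumes the inspection helpers of the `datasets` library (`get_dataset_config_names` / `get_dataset_split_names`) are available in your installed version.

```python
from datasets import get_dataset_config_names, get_dataset_split_names, load_dataset

repo = "open-llm-leaderboard/details_Inv__Konstanta-Gamma-10.9B"

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs), "configurations, e.g.:", configs[:3])

# Splits are named after the run timestamp; "latest" always points to the newest run.
print(get_dataset_split_names(repo, "harness_winogrande_5"))

# Load the aggregated metrics for the most recent run.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```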
## Latest results
These are the [latest results from run 2024-02-11T18:02:06.510209](https://huggingface.co/datasets/open-llm-leaderboard/details_Inv__Konstanta-Gamma-10.9B/blob/main/results_2024-02-11T18-02-06.510209.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6475009279331201,
"acc_stderr": 0.03221902383694426,
"acc_norm": 0.6494776456880369,
"acc_norm_stderr": 0.032866711896788726,
"mc1": 0.4785801713586291,
"mc1_stderr": 0.01748743214471181,
"mc2": 0.6418154025652248,
"mc2_stderr": 0.015280375375811582
},
"harness|arc:challenge|25": {
"acc": 0.6535836177474402,
"acc_stderr": 0.013905011180063232,
"acc_norm": 0.6825938566552902,
"acc_norm_stderr": 0.013602239088038167
},
"harness|hellaswag|10": {
"acc": 0.6982672774347739,
"acc_stderr": 0.0045807181159925065,
"acc_norm": 0.8738299143596893,
"acc_norm_stderr": 0.003313623560164932
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924006,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924006
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945627,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.02432173848460235,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.02432173848460235
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136098,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136098
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.45695364238410596,
"acc_stderr": 0.04067325174247443,
"acc_norm": 0.45695364238410596,
"acc_norm_stderr": 0.04067325174247443
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.02704462171947409,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.02704462171947409
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290923,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290923
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508287,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33631284916201115,
"acc_stderr": 0.015801003729145904,
"acc_norm": 0.33631284916201115,
"acc_norm_stderr": 0.015801003729145904
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.026716118380156847,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.026716118380156847
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7459807073954984,
"acc_stderr": 0.02472386150477169,
"acc_norm": 0.7459807073954984,
"acc_norm_stderr": 0.02472386150477169
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7654320987654321,
"acc_stderr": 0.023576881744005723,
"acc_norm": 0.7654320987654321,
"acc_norm_stderr": 0.023576881744005723
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.01274520462608314,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.01274520462608314
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.018771683893528176,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.018771683893528176
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913508,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827047,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827047
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072768,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072768
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4785801713586291,
"mc1_stderr": 0.01748743214471181,
"mc2": 0.6418154025652248,
"mc2_stderr": 0.015280375375811582
},
"harness|winogrande|5": {
"acc": 0.8097868981846882,
"acc_stderr": 0.011030335798617443
},
"harness|gsm8k|5": {
"acc": 0.5731614859742229,
"acc_stderr": 0.01362424969659522
}
}
```
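
As a quick illustration of how these per-task entries can be post-processed, the sketch below recomputes an unweighted average `acc_norm` over a few of the `hendrycksTest-*` (MMLU) entries copied verbatim from the JSON above. The simple averaging is only illustrative and is not necessarily the exact aggregation used by the leaderboard.

```python
import json

# A small excerpt of the aggregated results shown above (values copied verbatim).
results_excerpt = json.loads("""
{
  "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.32},
  "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6370370370370371},
  "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.6907894736842105}
}
""")

# Unweighted mean of normalized accuracy over the MMLU ("hendrycksTest") tasks.
mmlu_scores = [v["acc_norm"] for k, v in results_excerpt.items() if "hendrycksTest" in k]
print(f"MMLU excerpt average (acc_norm): {sum(mmlu_scores) / len(mmlu_scores):.4f}")
```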
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Inv__Konstanta-Gamma-10.9B | [
"region:us"
] | 2024-02-11T18:04:23+00:00 | {"pretty_name": "Evaluation run of Inv/Konstanta-Gamma-10.9B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Inv/Konstanta-Gamma-10.9B](https://huggingface.co/Inv/Konstanta-Gamma-10.9B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Inv__Konstanta-Gamma-10.9B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T18:02:06.510209](https://huggingface.co/datasets/open-llm-leaderboard/details_Inv__Konstanta-Gamma-10.9B/blob/main/results_2024-02-11T18-02-06.510209.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6475009279331201,\n \"acc_stderr\": 0.03221902383694426,\n \"acc_norm\": 0.6494776456880369,\n \"acc_norm_stderr\": 0.032866711896788726,\n \"mc1\": 0.4785801713586291,\n \"mc1_stderr\": 0.01748743214471181,\n \"mc2\": 0.6418154025652248,\n \"mc2_stderr\": 0.015280375375811582\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6535836177474402,\n \"acc_stderr\": 0.013905011180063232,\n \"acc_norm\": 0.6825938566552902,\n \"acc_norm_stderr\": 0.013602239088038167\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6982672774347739,\n \"acc_stderr\": 0.0045807181159925065,\n \"acc_norm\": 0.8738299143596893,\n \"acc_norm_stderr\": 0.003313623560164932\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924006,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924006\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945627,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945627\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6410256410256411,\n 
\"acc_stderr\": 0.02432173848460235,\n \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.02432173848460235\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136098,\n \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136098\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247443,\n \"acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247443\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.02704462171947409,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.02704462171947409\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290923,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290923\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 
0.8326947637292464,\n \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508287,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33631284916201115,\n \"acc_stderr\": 0.015801003729145904,\n \"acc_norm\": 0.33631284916201115,\n \"acc_norm_stderr\": 0.015801003729145904\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.026716118380156847,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.026716118380156847\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7459807073954984,\n \"acc_stderr\": 0.02472386150477169,\n \"acc_norm\": 0.7459807073954984,\n \"acc_norm_stderr\": 0.02472386150477169\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7654320987654321,\n \"acc_stderr\": 0.023576881744005723,\n \"acc_norm\": 0.7654320987654321,\n \"acc_norm_stderr\": 0.023576881744005723\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.01274520462608314,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.01274520462608314\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.018771683893528176,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.018771683893528176\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.04653429807913508,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.04653429807913508\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827047,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827047\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072768,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072768\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4785801713586291,\n \"mc1_stderr\": 0.01748743214471181,\n \"mc2\": 0.6418154025652248,\n \"mc2_stderr\": 0.015280375375811582\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8097868981846882,\n \"acc_stderr\": 0.011030335798617443\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5731614859742229,\n \"acc_stderr\": 0.01362424969659522\n }\n}\n```", "repo_url": 
"https://huggingface.co/Inv/Konstanta-Gamma-10.9B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|arc:challenge|25_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|gsm8k|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hellaswag|10_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T18-02-06.510209.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T18-02-06.510209.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T18-02-06.510209.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T18-02-06.510209.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T18-02-06.510209.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T18-02-06.510209.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["**/details_harness|winogrande|5_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T18-02-06.510209.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_11T18_02_06.510209", "path": ["results_2024-02-11T18-02-06.510209.parquet"]}, {"split": "latest", "path": 
["results_2024-02-11T18-02-06.510209.parquet"]}]}]} | 2024-02-11T18:04:47+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Inv/Konstanta-Gamma-10.9B
Dataset automatically created during the evaluation run of model Inv/Konstanta-Gamma-10.9B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
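A minimal sketch, matching the loading snippet recorded in this card's metadata:

```python
from datasets import load_dataset

# Load the per-sample details for the 5-shot Winogrande task;
# the "train" split points to the latest run.
data = load_dataset("open-llm-leaderboard/details_Inv__Konstanta-Gamma-10.9B",
	"harness_winogrande_5",
	split="train")
```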
## Latest results
These are the latest results from run 2024-02-11T18:02:06.510209 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Inv/Konstanta-Gamma-10.9B\n\n\n\nDataset automatically created during the evaluation run of model Inv/Konstanta-Gamma-10.9B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T18:02:06.510209(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Inv/Konstanta-Gamma-10.9B\n\n\n\nDataset automatically created during the evaluation run of model Inv/Konstanta-Gamma-10.9B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T18:02:06.510209(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
d9b5b635241fbac5528f7e88514b390b6b8023f1 |
# Dataset Card for Evaluation run of prince-canuma/Damysus-2.7B-Chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [prince-canuma/Damysus-2.7B-Chat](https://huggingface.co/prince-canuma/Damysus-2.7B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_prince-canuma__Damysus-2.7B-Chat",
"harness_winogrande_5",
split="train")
```
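The aggregated scores can be pulled the same way through the "results" configuration; a minimal sketch, assuming the "latest" split layout used by the other configurations in this dump:

```python
from datasets import load_dataset

# Load the aggregated metrics for the most recent run via the "results" configuration.
results = load_dataset("open-llm-leaderboard/details_prince-canuma__Damysus-2.7B-Chat",
	"results",
	split="latest")
print(results[0])  # one row of aggregated scores per run
```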
## Latest results
These are the [latest results from run 2024-02-11T18:24:01.869694](https://huggingface.co/datasets/open-llm-leaderboard/details_prince-canuma__Damysus-2.7B-Chat/blob/main/results_2024-02-11T18-24-01.869694.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5656611822068842,
"acc_stderr": 0.03387832160241618,
"acc_norm": 0.5669534270764037,
"acc_norm_stderr": 0.034575750321491405,
"mc1": 0.31456548347613217,
"mc1_stderr": 0.01625524199317918,
"mc2": 0.46445563407786983,
"mc2_stderr": 0.01530465594343596
},
"harness|arc:challenge|25": {
"acc": 0.5742320819112628,
"acc_stderr": 0.01444946427886881,
"acc_norm": 0.591296928327645,
"acc_norm_stderr": 0.014365750345426998
},
"harness|hellaswag|10": {
"acc": 0.5632344154550887,
"acc_stderr": 0.004949716368890488,
"acc_norm": 0.7435769766978689,
"acc_norm_stderr": 0.00435765648543858
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296563,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296563
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.569811320754717,
"acc_stderr": 0.030471445867183238,
"acc_norm": 0.569811320754717,
"acc_norm_stderr": 0.030471445867183238
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278006,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278006
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7096774193548387,
"acc_stderr": 0.025822106119415895,
"acc_norm": 0.7096774193548387,
"acc_norm_stderr": 0.025822106119415895
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.037131580674819135,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.037131580674819135
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7772020725388601,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.7772020725388601,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.02515826601686857,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.02515826601686857
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465076,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465076
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7889908256880734,
"acc_stderr": 0.01749392240411265,
"acc_norm": 0.7889908256880734,
"acc_norm_stderr": 0.01749392240411265
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.033540924375915195,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.033540924375915195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035296,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035296
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.026453508054040353,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.026453508054040353
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6794380587484036,
"acc_stderr": 0.016688893310803768,
"acc_norm": 0.6794380587484036,
"acc_norm_stderr": 0.016688893310803768
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.025950054337654075,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.025950054337654075
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25027932960893856,
"acc_stderr": 0.014487500852850412,
"acc_norm": 0.25027932960893856,
"acc_norm_stderr": 0.014487500852850412
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.027780141207023344,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.027780141207023344
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751464,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751464
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5987654320987654,
"acc_stderr": 0.0272725828498398,
"acc_norm": 0.5987654320987654,
"acc_norm_stderr": 0.0272725828498398
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.029097675599463926,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.029097675599463926
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40808344198174706,
"acc_stderr": 0.01255259895856366,
"acc_norm": 0.40808344198174706,
"acc_norm_stderr": 0.01255259895856366
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904611,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904611
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.020154685712590884,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.020154685712590884
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982066,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982066
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.028996909693328927,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.028996909693328927
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932264,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932264
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824564,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824564
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31456548347613217,
"mc1_stderr": 0.01625524199317918,
"mc2": 0.46445563407786983,
"mc2_stderr": 0.01530465594343596
},
"harness|winogrande|5": {
"acc": 0.7505919494869772,
"acc_stderr": 0.012160189196930692
},
"harness|gsm8k|5": {
"acc": 0.5018953752843063,
"acc_stderr": 0.013772385765569753
}
}
```
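
The same aggregated numbers are also stored as a standalone `results_*.json` file in the repository (the file linked above). A minimal sketch for pulling that file with `huggingface_hub` and summarizing it: the filename is copied from the link above, and the assumption that the per-task entries sit under a top-level `"results"` key (falling back to the root otherwise) is mine, not part of this card.

```python
import json
from huggingface_hub import hf_hub_download

# Download the latest results file from this dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_prince-canuma__Damysus-2.7B-Chat",
    filename="results_2024-02-11T18-24-01.869694.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)
results = data.get("results", data)  # assumed layout; fall back to the root

# Mean accuracy over the hendrycksTest (MMLU) subtasks.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print(f"mean MMLU acc: {sum(mmlu) / len(mmlu):.4f}")

# Rough 95% interval from the reported standard error (normal approximation).
gsm8k = results["harness|gsm8k|5"]
low, high = gsm8k["acc"] - 1.96 * gsm8k["acc_stderr"], gsm8k["acc"] + 1.96 * gsm8k["acc_stderr"]
print(f"GSM8K acc: {gsm8k['acc']:.4f}, 95% CI ~ [{low:.4f}, {high:.4f}]")
```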
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_prince-canuma__Damysus-2.7B-Chat | [
"region:us"
] | 2024-02-11T18:06:37+00:00 | {"pretty_name": "Evaluation run of prince-canuma/Damysus-2.7B-Chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [prince-canuma/Damysus-2.7B-Chat](https://huggingface.co/prince-canuma/Damysus-2.7B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_prince-canuma__Damysus-2.7B-Chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T18:24:01.869694](https://huggingface.co/datasets/open-llm-leaderboard/details_prince-canuma__Damysus-2.7B-Chat/blob/main/results_2024-02-11T18-24-01.869694.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5656611822068842,\n \"acc_stderr\": 0.03387832160241618,\n \"acc_norm\": 0.5669534270764037,\n \"acc_norm_stderr\": 0.034575750321491405,\n \"mc1\": 0.31456548347613217,\n \"mc1_stderr\": 0.01625524199317918,\n \"mc2\": 0.46445563407786983,\n \"mc2_stderr\": 0.01530465594343596\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5742320819112628,\n \"acc_stderr\": 0.01444946427886881,\n \"acc_norm\": 0.591296928327645,\n \"acc_norm_stderr\": 0.014365750345426998\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5632344154550887,\n \"acc_stderr\": 0.004949716368890488,\n \"acc_norm\": 0.7435769766978689,\n \"acc_norm_stderr\": 0.00435765648543858\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296563,\n \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296563\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.569811320754717,\n \"acc_stderr\": 0.030471445867183238,\n \"acc_norm\": 0.569811320754717,\n \"acc_norm_stderr\": 0.030471445867183238\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n 
\"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.04372748290278006,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.04372748290278006\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7096774193548387,\n \"acc_stderr\": 0.025822106119415895,\n \"acc_norm\": 0.7096774193548387,\n \"acc_norm_stderr\": 0.025822106119415895\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819135,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819135\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.03003114797764154,\n \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.03003114797764154\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.02515826601686857,\n \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.02515826601686857\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465076,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465076\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7889908256880734,\n \"acc_stderr\": 0.01749392240411265,\n \"acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.01749392240411265\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.033540924375915195,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.033540924375915195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035296,\n \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035296\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\": 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.041858325989283136,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.041858325989283136\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n \"acc_stderr\": 0.026453508054040353,\n \"acc_norm\": 0.7948717948717948,\n \"acc_norm_stderr\": 0.026453508054040353\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.6794380587484036,\n \"acc_stderr\": 0.016688893310803768,\n \"acc_norm\": 0.6794380587484036,\n \"acc_norm_stderr\": 0.016688893310803768\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.025950054337654075,\n \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.025950054337654075\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n \"acc_stderr\": 0.014487500852850412,\n \"acc_norm\": 0.25027932960893856,\n \"acc_norm_stderr\": 0.014487500852850412\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023344,\n \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023344\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5987654320987654,\n \"acc_stderr\": 0.0272725828498398,\n \"acc_norm\": 0.5987654320987654,\n \"acc_norm_stderr\": 0.0272725828498398\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3900709219858156,\n \"acc_stderr\": 0.029097675599463926,\n \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.029097675599463926\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40808344198174706,\n \"acc_stderr\": 0.01255259895856366,\n \"acc_norm\": 0.40808344198174706,\n \"acc_norm_stderr\": 0.01255259895856366\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904611,\n \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904611\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5424836601307189,\n \"acc_stderr\": 0.020154685712590884,\n \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.020154685712590884\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982066,\n \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982066\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n \"acc_stderr\": 0.028996909693328927,\n \"acc_norm\": 0.7860696517412935,\n \"acc_norm_stderr\": 0.028996909693328927\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932264,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932264\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824564,\n \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824564\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31456548347613217,\n \"mc1_stderr\": 0.01625524199317918,\n \"mc2\": 0.46445563407786983,\n \"mc2_stderr\": 0.01530465594343596\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7505919494869772,\n \"acc_stderr\": 0.012160189196930692\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5018953752843063,\n \"acc_stderr\": 
0.013772385765569753\n }\n}\n```", "repo_url": "https://huggingface.co/prince-canuma/Damysus-2.7B-Chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|arc:challenge|25_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|arc:challenge|25_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|gsm8k|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|gsm8k|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hellaswag|10_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hellaswag|10_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T18-04-55.056772.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T18-04-55.056772.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T18-24-01.869694.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T18-24-01.869694.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T18-24-01.869694.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T18-24-01.869694.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T18-24-01.869694.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": 
"2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T18-04-55.056772.parquet"]}, 
{"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["**/details_harness|winogrande|5_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": ["**/details_harness|winogrande|5_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T18-24-01.869694.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_11T18_04_55.056772", "path": ["results_2024-02-11T18-04-55.056772.parquet"]}, {"split": "2024_02_11T18_24_01.869694", "path": 
["results_2024-02-11T18-24-01.869694.parquet"]}, {"split": "latest", "path": ["results_2024-02-11T18-24-01.869694.parquet"]}]}]} | 2024-02-11T18:26:07+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of prince-canuma/Damysus-2.7B-Chat
Dataset automatically created during the evaluation run of model prince-canuma/Damysus-2.7B-Chat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
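A minimal sketch, assuming the details repository follows the usual `open-llm-leaderboard/details_<org>__<model>` naming convention and that the `harness_winogrande_5` configuration exists for this run; the repository id and config name below are assumptions, not confirmed values:

```python
# Hedged example: load one configuration of this evaluation-details dataset.
# The repository id and config name are assumed from the naming convention used
# by similar cards in this collection; adjust them if the actual repository differs.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_prince-canuma__Damysus-2.7B-Chat",
    "harness_winogrande_5",
    split="train",
)
```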
## Latest results
These are the latest results from run 2024-02-11T18:24:01.869694 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of prince-canuma/Damysus-2.7B-Chat\n\n\n\nDataset automatically created during the evaluation run of model prince-canuma/Damysus-2.7B-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T18:24:01.869694(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of prince-canuma/Damysus-2.7B-Chat\n\n\n\nDataset automatically created during the evaluation run of model prince-canuma/Damysus-2.7B-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T18:24:01.869694(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
abe91ddda3e5b1ade9338a13af1478b4c6743611 | # Dataset Card for "CorningAI-DocQA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | myngsoooo/CorningAI-DocQA | [
"region:us"
] | 2024-02-11T18:13:05+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7216154, "num_examples": 3546}, {"name": "validation", "num_bytes": 383213, "num_examples": 187}, {"name": "test", "num_bytes": 383213, "num_examples": 187}], "download_size": 3459100, "dataset_size": 7982580}} | 2024-02-11T18:13:16+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "CorningAI-DocQA"
More Information needed | [
"# Dataset Card for \"CorningAI-DocQA\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"CorningAI-DocQA\"\n\nMore Information needed"
] |
94b78f75c3a1e35149c225d08fb011afc5584b84 | # Dataset Card for "boolq"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | skrishna/boolq | [
"region:us"
] | 2024-02-11T18:45:19+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "bool"}, {"name": "passage", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 12764501, "num_examples": 9427}, {"name": "test", "num_bytes": 4379782, "num_examples": 3270}], "download_size": 10122256, "dataset_size": 17144283}} | 2024-02-11T19:28:43+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "boolq"
More Information needed | [
"# Dataset Card for \"boolq\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"boolq\"\n\nMore Information needed"
] |
70594a58381f5f1ebb46d67515d874f6e2d4a6fe |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
Tokenizer: mbert
Dataset: TASTEset
Unshuffled ratio: 1
Shuffled ratio: 0
Drop duplicates: True
Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-MT_LOC.json
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | pgajo/EW-TT-MT_LOC_U1_S0_DROP1_mbert | [
"region:us"
] | 2024-02-11T18:54:03+00:00 | {} | 2024-02-11T18:54:10+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
Tokenizer: mbert
Dataset: TASTEset
Unshuffled ratio: 1
Shuffled ratio: 0
Drop duplicates: True
Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-MT_LOC.json
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mbert\n\n Dataset: TASTEset\n\n Unshuffled ratio: 1\n\n Shuffled ratio: 0\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-MT_LOC.json",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mbert\n\n Dataset: TASTEset\n\n Unshuffled ratio: 1\n\n Shuffled ratio: 0\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-MT_LOC.json",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
9da8067216439bbde7ab8960e617d7f285b4c5b4 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
Tokenizer: mbert
Dataset: TASTEset
Unshuffled ratio: 0
Shuffled ratio: 1
Drop duplicates: True
Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-MT_LOC.json
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | pgajo/EW-TT-MT_LOC_U0_S1_DROP1_mbert | [
"region:us"
] | 2024-02-11T18:54:51+00:00 | {} | 2024-02-11T18:54:55+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
Tokenizer: mbert
Dataset: TASTEset
Unshuffled ratio: 0
Shuffled ratio: 1
Drop duplicates: True
Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-MT_LOC.json
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mbert\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-MT_LOC.json",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mbert\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-MT_LOC.json",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
a51524b32c7be3c8dc1d5157ca176aeeda65cd7d |
# Dataset Card for Evaluation run of macadeliccc/MBX-7B-v3-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/MBX-7B-v3-DPO](https://huggingface.co/macadeliccc/MBX-7B-v3-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__MBX-7B-v3-DPO",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-11T18:53:41.876317](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__MBX-7B-v3-DPO/blob/main/results_2024-02-11T18-53-41.876317.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6554435066939919,
"acc_stderr": 0.03198817220538892,
"acc_norm": 0.6546676568515765,
"acc_norm_stderr": 0.03266175930986744,
"mc1": 0.5862913096695227,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.7399782698428227,
"mc2_stderr": 0.014395363250478046
},
"harness|arc:challenge|25": {
"acc": 0.7150170648464164,
"acc_stderr": 0.013191348179838793,
"acc_norm": 0.735494880546075,
"acc_norm_stderr": 0.012889272949313368
},
"harness|hellaswag|10": {
"acc": 0.715893248356901,
"acc_stderr": 0.004500662294697923,
"acc_norm": 0.8910575582553276,
"acc_norm_stderr": 0.003109302300176215
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055277,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055277
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188716,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188716
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.027479603010538797,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.027479603010538797
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508297,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508297
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4402234636871508,
"acc_stderr": 0.016602564615049942,
"acc_norm": 0.4402234636871508,
"acc_norm_stderr": 0.016602564615049942
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5862913096695227,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.7399782698428227,
"mc2_stderr": 0.014395363250478046
},
"harness|winogrande|5": {
"acc": 0.8555643251775849,
"acc_stderr": 0.009879767358079232
},
"harness|gsm8k|5": {
"acc": 0.6967399545109931,
"acc_stderr": 0.012661502663418697
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_macadeliccc__MBX-7B-v3-DPO | [
"region:us"
] | 2024-02-11T18:56:01+00:00 | {"pretty_name": "Evaluation run of macadeliccc/MBX-7B-v3-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [macadeliccc/MBX-7B-v3-DPO](https://huggingface.co/macadeliccc/MBX-7B-v3-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__MBX-7B-v3-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T18:53:41.876317](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__MBX-7B-v3-DPO/blob/main/results_2024-02-11T18-53-41.876317.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6554435066939919,\n \"acc_stderr\": 0.03198817220538892,\n \"acc_norm\": 0.6546676568515765,\n \"acc_norm_stderr\": 0.03266175930986744,\n \"mc1\": 0.5862913096695227,\n \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.7399782698428227,\n \"mc2_stderr\": 0.014395363250478046\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7150170648464164,\n \"acc_stderr\": 0.013191348179838793,\n \"acc_norm\": 0.735494880546075,\n \"acc_norm_stderr\": 0.012889272949313368\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.715893248356901,\n \"acc_stderr\": 0.004500662294697923,\n \"acc_norm\": 0.8910575582553276,\n \"acc_norm_stderr\": 0.003109302300176215\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493857,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493857\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n 
\"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055277,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055277\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188716,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188716\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.027479603010538797,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.027479603010538797\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n 
\"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n 
\"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508297,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508297\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n \"acc_stderr\": 0.016602564615049942,\n \"acc_norm\": 0.4402234636871508,\n \"acc_norm_stderr\": 0.016602564615049942\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5862913096695227,\n \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.7399782698428227,\n \"mc2_stderr\": 0.014395363250478046\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8555643251775849,\n \"acc_stderr\": 0.009879767358079232\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6967399545109931,\n \"acc_stderr\": 0.012661502663418697\n }\n}\n```", "repo_url": "https://huggingface.co/macadeliccc/MBX-7B-v3-DPO", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|arc:challenge|25_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|gsm8k|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hellaswag|10_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T18-53-41.876317.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T18-53-41.876317.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T18-53-41.876317.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T18-53-41.876317.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T18-53-41.876317.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T18-53-41.876317.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["**/details_harness|winogrande|5_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T18-53-41.876317.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_11T18_53_41.876317", "path": ["results_2024-02-11T18-53-41.876317.parquet"]}, {"split": "latest", "path": 
["results_2024-02-11T18-53-41.876317.parquet"]}]}]} | 2024-02-11T18:56:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of macadeliccc/MBX-7B-v3-DPO
Dataset automatically created during the evaluation run of model macadeliccc/MBX-7B-v3-DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
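A minimal sketch of that call, with the repository and configuration names taken from this card's metadata (any other listed config can be substituted):

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task of this run.
data = load_dataset("open-llm-leaderboard/details_macadeliccc__MBX-7B-v3-DPO",
	"harness_winogrande_5",
	split="train")
```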
## Latest results
These are the latest results from run 2024-02-11T18:53:41.876317 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
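The same aggregated numbers can also be pulled programmatically from the "results" configuration; a short sketch, using the "latest" split listed in this card's metadata:

```python
from datasets import load_dataset

# Aggregated scores of the most recent evaluation run for this model.
results = load_dataset("open-llm-leaderboard/details_macadeliccc__MBX-7B-v3-DPO",
	"results",
	split="latest")
print(results[0])
```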
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of macadeliccc/MBX-7B-v3-DPO\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/MBX-7B-v3-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T18:53:41.876317(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of macadeliccc/MBX-7B-v3-DPO\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/MBX-7B-v3-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T18:53:41.876317(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
5cee83a96f25e38faec70ab80c837324c386bb30 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
Tokenizer: mdeberta
Dataset: TASTEset
Unshuffled ratio: 1
Shuffled ratio: 0
Drop duplicates: True
Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-MT_LOC.json
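A minimal sketch of how this dataset might be loaded from the Hub, assuming the repository id under which this card is published (pgajo/EW-TT-MT_LOC_U1_S0_DROP1_mdeberta, as listed with this record); available splits are not documented here, so the whole dataset dictionary is loaded:

```python
from datasets import load_dataset

# Repository id taken from this record; split names are an assumption left unresolved,
# so we load every available split and inspect the result.
dataset = load_dataset("pgajo/EW-TT-MT_LOC_U1_S0_DROP1_mdeberta")
print(dataset)
```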
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | pgajo/EW-TT-MT_LOC_U1_S0_DROP1_mdeberta | [
"region:us"
] | 2024-02-11T18:56:15+00:00 | {} | 2024-02-11T18:56:22+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
Tokenizer: mdeberta
Dataset: TASTEset
Unshuffled ratio: 1
Shuffled ratio: 0
Drop duplicates: True
Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-MT_LOC.json
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mdeberta\n\n Dataset: TASTEset\n\n Unshuffled ratio: 1\n\n Shuffled ratio: 0\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-MT_LOC.json",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mdeberta\n\n Dataset: TASTEset\n\n Unshuffled ratio: 1\n\n Shuffled ratio: 0\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-MT_LOC.json",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
1b4420ac05b2295d4a32e3f12f22da1e29659e87 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
Tokenizer: mdeberta
Dataset: TASTEset
Unshuffled ratio: 0
Shuffled ratio: 1
Drop duplicates: True
Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-MT_LOC.json
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | pgajo/EW-TT-MT_LOC_U0_S1_DROP1_mdeberta | [
"region:us"
] | 2024-02-11T18:56:36+00:00 | {} | 2024-02-11T18:56:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
Tokenizer: mdeberta
Dataset: TASTEset
Unshuffled ratio: 0
Shuffled ratio: 1
Drop duplicates: True
Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-MT_LOC.json
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mdeberta\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-MT_LOC.json",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mdeberta\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-MT_LOC.json",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
a5f00befe05fcca60c5456b8033739ab1dee1ce4 | # Dataset Card for "Zeus-v0.1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | erfanzar/Zeus-v0.1 | [
"region:us"
] | 2024-02-11T19:16:18+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}, {"name": "weight", "dtype": "float64"}]}, {"name": "source", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "model", "dtype": "null"}], "splits": [{"name": "train", "num_bytes": 651231416, "num_examples": 386175}], "download_size": 327788195, "dataset_size": 651231416}} | 2024-02-11T19:16:40+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Zeus-v0.1"
More Information needed | [
"# Dataset Card for \"Zeus-v0.1\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Zeus-v0.1\"\n\nMore Information needed"
] |
ff58eebd7fae696554a0adb337404e02106efda6 |
Duplicated from `philschmid/sharegpt-raw`, which is marked as duplicated from `jeffwan/sharegpt_vicuna`, up to step 3.
Then processed with the [Better Uncensored (BUn) pipeline](https://huggingface.co/sudoaza/better-uncensored). A version with long conversations split is also provided.
Now we have the cleaned uncensored dataset in `sharegpt_20230401_clean_bun.json` (57058 conversations) and the same dataset with long conversations split in `sharegpt_20230401_clean_split_bun.json` (103152 conversations).
The latter should be a drop-in replacement for `anon8231489123/ShareGPT_Vicuna_unfiltered`.
**Note:** the BUn pipeline removes conversations that are mostly non-ASCII, so it would not be usable for languages that rely mainly on non-ASCII scripts, such as Chinese or Russian.
## Preparation
```
pip3 install -r requirements.txt
```
## Data Cleaning
1. Merge the two raw JSON files and beautify the merged file
```
python merge.py sharegpt_90k_raw_dataset/sg_90k_part1.json sharegpt_90k_raw_dataset/sg_90k_part2.json sharegpt_20230401_html_unformatted.json
python pretty_json.py --in sharegpt_20230401_html_unformatted.json --out sharegpt_20230401_html.json
```
2. (Optional) Verify the JSON file
```
if jq empty sharegpt_20230401_html.json 2>/dev/null; then
echo "JSON is valid"
else
echo "JSON is invalid"
fi
jq length sharegpt_90k_raw_dataset/sg_90k_part1.json
jq length sharegpt_90k_raw_dataset/sg_90k_part2.json
jq length sharegpt_20230401_html.json
```
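If `jq` is not available, the same validation can be done with a short Python script (a sketch, assuming the same filenames as above):

```python
import json

# Validate each file and report how many conversations it contains.
for path in [
    "sharegpt_90k_raw_dataset/sg_90k_part1.json",
    "sharegpt_90k_raw_dataset/sg_90k_part2.json",
    "sharegpt_20230401_html.json",
]:
    try:
        with open(path, encoding="utf-8") as f:
            data = json.load(f)
        print(f"{path}: valid JSON, {len(data)} conversations")
    except json.JSONDecodeError as e:
        print(f"{path}: invalid JSON ({e})")
```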
3. Clean the data: remove HTML tags, etc.
```
python3 clean_sharegpt.py --in sharegpt_20230401_html.json --out sharegpt_20230401_clean.json
....
100%|███████████████████████████████████████████████████████████████████| 90665/90665 [06:32<00:00, 230.98it/s]
total: 90665, skip: 13745, new: 76920
```
4. Uncensor with BUn
```
python uncensor_sharegpt.py --in-file sharegpt_20230401_clean.json --out-file sharegpt_20230401_clean_bun.json
....
total: 76920, skip: 19862, new: 57058, uncen: 0
```
5. Split long conversations
```
python -m fastchat.data.split_long_conversation --in sharegpt_20230401_clean_bun.json --out sharegpt_20230401_clean_split_bun.json --model-name meta-llama/Llama-2-13b-hf
...
#in: 57058, #out: 103152
```
Now we have the cleaned uncensored dataset in `sharegpt_20230401_clean_bun.json` (57058 conversations) and the same dataset with long conversations split in `sharegpt_20230401_clean_split_bun.json` (103152 conversations).
The latter should be a drop-in replacement for `anon8231489123/ShareGPT_Vicuna_unfiltered`.
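As a quick sanity check, the output files can be loaded directly to reproduce the conversation counts above (a minimal sketch, assuming the files sit in the working directory and follow the usual ShareGPT layout with an `id` and a `conversations` list per record):

```python
import json

for path in ["sharegpt_20230401_clean_bun.json", "sharegpt_20230401_clean_split_bun.json"]:
    with open(path, encoding="utf-8") as f:
        data = json.load(f)
    print(f"{path}: {len(data)} conversations")

# Peek at one record of the split file to confirm the ShareGPT-style layout.
sample = data[0]
print(sample.get("id"), "-", len(sample.get("conversations", [])), "turns")
```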
| betteruncensored/sharegpt | [
"license:other",
"region:us"
] | 2024-02-11T19:26:57+00:00 | {"license": "other", "duplicated_from": "jeffwan/sharegpt_vicuna"} | 2024-02-12T22:23:06+00:00 | [] | [] | TAGS
#license-other #region-us
|
Duplicated from 'philschmid/sharegpt-raw' which is marked as duplicated from 'jeffwan/sharegpt_vicuna' until step 3.
Then processed with the Better Uncensored (BUn) pipeline. A version with long conversations split is also provided.
Now we have the cleaned uncensored dataset in 'sharegpt_20230401_clean_bun.json' (57058 conversations) and the same with split long conversations in 'sharegpt_20230401_clean_split_bun.json' (103152 conversations).
This latest one should be a drop-in replacement for 'anon8231489123/ShareGPT_Vicuna_unfiltered'.
Note: the BUn pipeline removes mostly unicode conversations, so this would not be usable for mainly non-ASCII languages like Chinese, Russian, etc.
## Preparation
## Data Cleaning
1. merge two raw json files and json beautify the merged file
2. (Optional) Verify the json file
3. clean data - remove html tags etc
4. uncensor with BUn
5. Split the long conversation
Now we have the cleaned uncensored dataset in 'sharegpt_20230401_clean_bun.json' (57058 conversations) and the same with split long conversations in 'sharegpt_20230401_clean_split_bun.json' (103152 conversations).
This latest one should be a drop-in replacement for 'anon8231489123/ShareGPT_Vicuna_unfiltered'.
| [
"## Prepraration",
"## Data Cleaning\n\n1. merge two raw json files and json beautify the merged file\n\n\n\n2. (Optional) Verify the json file\n\n\n\n3. clean data - remove html tags etc\n\n\n\n4. uncensor with BUn\n\n\n\n5. Split the long conversation\n\n\n\nNow we have the cleaned uncensored dataset in 'sharegpt_20230401_clean_bun.json' (57058 conversations) and the same with split long conversations in 'sharegpt_20230401_clean_split_bun.json' (103152 conversations).\n\nThis latest one should be a drop-in replacement for 'anon8231489123/ShareGPT_Vicuna_unfiltered'."
] | [
"TAGS\n#license-other #region-us \n",
"## Prepraration",
"## Data Cleaning\n\n1. merge two raw json files and json beautify the merged file\n\n\n\n2. (Optional) Verify the json file\n\n\n\n3. clean data - remove html tags etc\n\n\n\n4. uncensor with BUn\n\n\n\n5. Split the long conversation\n\n\n\nNow we have the cleaned uncensored dataset in 'sharegpt_20230401_clean_bun.json' (57058 conversations) and the same with split long conversations in 'sharegpt_20230401_clean_split_bun.json' (103152 conversations).\n\nThis latest one should be a drop-in replacement for 'anon8231489123/ShareGPT_Vicuna_unfiltered'."
] |
5bf46f8cf5c8dd553d394248fe3e69aa9bc021ba |
# Dataset Card for Evaluation run of yam-peleg/Experiment7-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yam-peleg/Experiment7-7B](https://huggingface.co/yam-peleg/Experiment7-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yam-peleg__Experiment7-7B",
"harness_winogrande_5",
split="train")
```
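The aggregated scores can be loaded the same way from the "results" configuration mentioned above (a sketch; it relies only on the configuration and split names described in this card):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "train" split points to the latest results.
results = load_dataset("open-llm-leaderboard/details_yam-peleg__Experiment7-7B",
                       "results",
                       split="train")
print(results[0])
```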
## Latest results
These are the [latest results from run 2024-02-11T19:25:35.851401](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment7-7B/blob/main/results_2024-02-11T19-25-35.851401.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6567730273281559,
"acc_stderr": 0.03199263283100071,
"acc_norm": 0.6574973933898954,
"acc_norm_stderr": 0.03264096338099939,
"mc1": 0.565483476132191,
"mc1_stderr": 0.01735273874925956,
"mc2": 0.7059020061319847,
"mc2_stderr": 0.01499668880054581
},
"harness|arc:challenge|25": {
"acc": 0.7013651877133106,
"acc_stderr": 0.013374078615068744,
"acc_norm": 0.7184300341296929,
"acc_norm_stderr": 0.013143376735009022
},
"harness|hellaswag|10": {
"acc": 0.7126070503883688,
"acc_stderr": 0.004516215206715352,
"acc_norm": 0.8804023102967536,
"acc_norm_stderr": 0.003238273295284749
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531003,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.02616056824660146,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.02616056824660146
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.0134682016140663,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.0134682016140663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45363128491620114,
"acc_stderr": 0.016650437588269073,
"acc_norm": 0.45363128491620114,
"acc_norm_stderr": 0.016650437588269073
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495148,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495148
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.565483476132191,
"mc1_stderr": 0.01735273874925956,
"mc2": 0.7059020061319847,
"mc2_stderr": 0.01499668880054581
},
"harness|winogrande|5": {
"acc": 0.8082083662194159,
"acc_stderr": 0.011065209664659527
},
"harness|gsm8k|5": {
"acc": 0.6474601971190296,
"acc_stderr": 0.013159909755930324
}
}
```
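As a worked example of how the per-task numbers above roll up, the `hendrycksTest` (MMLU) accuracies can be averaged from the parsed JSON (a sketch, assuming the block above has been saved to `results.json`):

```python
import json

with open("results.json", encoding="utf-8") as f:
    results = json.load(f)

# Every MMLU subtask is keyed as "harness|hendrycksTest-<subject>|5".
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```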
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_yam-peleg__Experiment7-7B | [
"region:us"
] | 2024-02-11T19:27:54+00:00 | {"pretty_name": "Evaluation run of yam-peleg/Experiment7-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [yam-peleg/Experiment7-7B](https://huggingface.co/yam-peleg/Experiment7-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yam-peleg__Experiment7-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T19:25:35.851401](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment7-7B/blob/main/results_2024-02-11T19-25-35.851401.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6567730273281559,\n \"acc_stderr\": 0.03199263283100071,\n \"acc_norm\": 0.6574973933898954,\n \"acc_norm_stderr\": 0.03264096338099939,\n \"mc1\": 0.565483476132191,\n \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.7059020061319847,\n \"mc2_stderr\": 0.01499668880054581\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068744,\n \"acc_norm\": 0.7184300341296929,\n \"acc_norm_stderr\": 0.013143376735009022\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7126070503883688,\n \"acc_stderr\": 0.004516215206715352,\n \"acc_norm\": 0.8804023102967536,\n \"acc_norm_stderr\": 0.003238273295284749\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n 
\"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657262,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657262\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.02616056824660146,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.02616056824660146\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.0134682016140663,\n \"acc_norm\": 
0.8288633461047255,\n \"acc_norm_stderr\": 0.0134682016140663\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45363128491620114,\n \"acc_stderr\": 0.016650437588269073,\n \"acc_norm\": 0.45363128491620114,\n \"acc_norm_stderr\": 0.016650437588269073\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495148,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495148\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.565483476132191,\n \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.7059020061319847,\n \"mc2_stderr\": 0.01499668880054581\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6474601971190296,\n \"acc_stderr\": 0.013159909755930324\n }\n}\n```", "repo_url": "https://huggingface.co/yam-peleg/Experiment7-7B", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|arc:challenge|25_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|gsm8k|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hellaswag|10_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T19-25-35.851401.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T19-25-35.851401.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T19-25-35.851401.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T19-25-35.851401.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T19-25-35.851401.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T19-25-35.851401.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["**/details_harness|winogrande|5_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T19-25-35.851401.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_11T19_25_35.851401", "path": ["results_2024-02-11T19-25-35.851401.parquet"]}, {"split": "latest", "path": 
["results_2024-02-11T19-25-35.851401.parquet"]}]}]} | 2024-02-11T19:28:15+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of yam-peleg/Experiment7-7B
Dataset automatically created during the evaluation run of model yam-peleg/Experiment7-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-11T19:25:35.851401 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of yam-peleg/Experiment7-7B\n\n\n\nDataset automatically created during the evaluation run of model yam-peleg/Experiment7-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T19:25:35.851401(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of yam-peleg/Experiment7-7B\n\n\n\nDataset automatically created during the evaluation run of model yam-peleg/Experiment7-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T19:25:35.851401(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
1f7b4fef69f04498c5b4b5fedb283de4a7f99009 |
# Dataset Card for Evaluation run of vicgalle/NeuralBeagle-11B-truthy
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vicgalle/NeuralBeagle-11B-truthy](https://huggingface.co/vicgalle/NeuralBeagle-11B-truthy) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vicgalle__NeuralBeagle-11B-truthy",
"harness_winogrande_5",
split="train")
```
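As a minimal sketch of selecting a specific run rather than the default split (the per-task configuration name below follows the naming pattern these detail repositories use, so treat it as illustrative; the split can be either the run timestamp or "latest"), loading a single sub-task looks like this:
```python
from datasets import load_dataset

# Illustrative example: load one MMLU sub-task for this model.
# The config name follows the repository's naming pattern; valid splits are
# the run timestamp (e.g. "2024_02_11T19_27_37.984436") or "latest".
data = load_dataset("open-llm-leaderboard/details_vicgalle__NeuralBeagle-11B-truthy",
                    "harness_hendrycksTest_abstract_algebra_5",
                    split="latest")
print(data)
```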
## Latest results
These are the [latest results from run 2024-02-11T19:27:37.984436](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__NeuralBeagle-11B-truthy/blob/main/results_2024-02-11T19-27-37.984436.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6342999713787871,
"acc_stderr": 0.03262677892134931,
"acc_norm": 0.6370739871356,
"acc_norm_stderr": 0.03328660511823099,
"mc1": 0.620563035495716,
"mc1_stderr": 0.016987039266142968,
"mc2": 0.7592110983636813,
"mc2_stderr": 0.014145118687247313
},
"harness|arc:challenge|25": {
"acc": 0.7056313993174061,
"acc_stderr": 0.013318528460539419,
"acc_norm": 0.7363481228668942,
"acc_norm_stderr": 0.01287592915129705
},
"harness|hellaswag|10": {
"acc": 0.6960764787890859,
"acc_stderr": 0.004590100050198815,
"acc_norm": 0.8786098386775543,
"acc_norm_stderr": 0.003259127057668171
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.02423353229775873,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.02423353229775873
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066482,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.01659525971039931,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.01659525971039931
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973138,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973138
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43575418994413406,
"acc_stderr": 0.016583881958602394,
"acc_norm": 0.43575418994413406,
"acc_norm_stderr": 0.016583881958602394
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153262,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153262
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.01274920600765747,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.01274920600765747
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.0283329595140312,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.0283329595140312
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495144,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495144
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768917,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768917
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.620563035495716,
"mc1_stderr": 0.016987039266142968,
"mc2": 0.7592110983636813,
"mc2_stderr": 0.014145118687247313
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047989
},
"harness|gsm8k|5": {
"acc": 0.4973464746019712,
"acc_stderr": 0.01377229076885817
}
}
```
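The aggregated metrics above are also stored in this dataset itself. A minimal sketch for reloading them (assuming the "results" configuration and "latest" split that these detail repositories define):
```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_vicgalle__NeuralBeagle-11B-truthy",
                       "results",
                       split="latest")
print(results[0])  # one row per run, containing the aggregated metrics
```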
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_vicgalle__NeuralBeagle-11B-truthy | [
"region:us"
] | 2024-02-11T19:29:53+00:00 | {"pretty_name": "Evaluation run of vicgalle/NeuralBeagle-11B-truthy", "dataset_summary": "Dataset automatically created during the evaluation run of model [vicgalle/NeuralBeagle-11B-truthy](https://huggingface.co/vicgalle/NeuralBeagle-11B-truthy) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vicgalle__NeuralBeagle-11B-truthy\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T19:27:37.984436](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__NeuralBeagle-11B-truthy/blob/main/results_2024-02-11T19-27-37.984436.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6342999713787871,\n \"acc_stderr\": 0.03262677892134931,\n \"acc_norm\": 0.6370739871356,\n \"acc_norm_stderr\": 0.03328660511823099,\n \"mc1\": 0.620563035495716,\n \"mc1_stderr\": 0.016987039266142968,\n \"mc2\": 0.7592110983636813,\n \"mc2_stderr\": 0.014145118687247313\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7056313993174061,\n \"acc_stderr\": 0.013318528460539419,\n \"acc_norm\": 0.7363481228668942,\n \"acc_norm_stderr\": 0.01287592915129705\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6960764787890859,\n \"acc_stderr\": 0.004590100050198815,\n \"acc_norm\": 0.8786098386775543,\n \"acc_norm_stderr\": 0.003259127057668171\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.02423353229775873,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.02423353229775873\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8165137614678899,\n \"acc_stderr\": 0.01659525971039931,\n \"acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.01659525971039931\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n \"acc_stderr\": 
0.013816335389973138,\n \"acc_norm\": 0.8173690932311622,\n \"acc_norm_stderr\": 0.013816335389973138\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43575418994413406,\n \"acc_stderr\": 0.016583881958602394,\n \"acc_norm\": 0.43575418994413406,\n \"acc_norm_stderr\": 0.016583881958602394\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153262,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153262\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n \"acc_stderr\": 0.01274920600765747,\n \"acc_norm\": 0.47131681877444587,\n \"acc_norm_stderr\": 0.01274920600765747\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.0283329595140312,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.0283329595140312\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495144,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495144\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.027962677604768917,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.027962677604768917\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.620563035495716,\n \"mc1_stderr\": 0.016987039266142968,\n \"mc2\": 0.7592110983636813,\n \"mc2_stderr\": 0.014145118687247313\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047989\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4973464746019712,\n \"acc_stderr\": 0.01377229076885817\n }\n}\n```", "repo_url": 
"https://huggingface.co/vicgalle/NeuralBeagle-11B-truthy", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|arc:challenge|25_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|gsm8k|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hellaswag|10_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T19-27-37.984436.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T19-27-37.984436.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T19-27-37.984436.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T19-27-37.984436.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T19-27-37.984436.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T19-27-37.984436.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["**/details_harness|winogrande|5_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T19-27-37.984436.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_11T19_27_37.984436", "path": ["results_2024-02-11T19-27-37.984436.parquet"]}, {"split": "latest", "path": 
["results_2024-02-11T19-27-37.984436.parquet"]}]}]} | 2024-02-11T19:30:17+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of vicgalle/NeuralBeagle-11B-truthy
Dataset automatically created during the evaluation run of model vicgalle/NeuralBeagle-11B-truthy on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
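A minimal sketch, assuming the details repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming (so `details_vicgalle__NeuralBeagle-11B-truthy` here); any config name listed in this card can be substituted for `harness_winogrande_5`:

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's naming convention; adjust if it differs.
data = load_dataset(
    "open-llm-leaderboard/details_vicgalle__NeuralBeagle-11B-truthy",
    "harness_winogrande_5",  # or "results" for the aggregated metrics
    split="train",           # "train" always points to the latest run
)
```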
## Latest results
These are the latest results from run 2024-02-11T19:27:37.984436 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of vicgalle/NeuralBeagle-11B-truthy\n\n\n\nDataset automatically created during the evaluation run of model vicgalle/NeuralBeagle-11B-truthy on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T19:27:37.984436(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of vicgalle/NeuralBeagle-11B-truthy\n\n\n\nDataset automatically created during the evaluation run of model vicgalle/NeuralBeagle-11B-truthy on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T19:27:37.984436(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
07d8d1d540728228e6b697b4a14cfa0896a7f89e |
QA dataset for testing an LLM's ability to understand the meaning of Danish idioms. The dataset is based on [Juunge/danske-talemaader](https://huggingface.co/datasets/Juunge/danske-talemaader).
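As an illustrative sketch (the column names — `instruction`, `option_a`–`option_d`, `answer`, `id` — are taken from this card's metadata), the idioms can be loaded and inspected like this:

```python
from datasets import load_dataset

# Load the single "train" split declared in this card's metadata.
ds = load_dataset("Juunge/danske-talemaader-QA", split="train")

row = ds[0]
print(row["instruction"])  # the idiom question posed to the model
for option in ("option_a", "option_b", "option_c", "option_d"):
    print(option, "->", row[option])
print("answer:", row["answer"])  # the correct option, as stored in the dataset
```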
| Juunge/danske-talemaader-QA | [
"region:us"
] | 2024-02-11T19:33:14+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "option_a", "dtype": "string"}, {"name": "option_b", "dtype": "string"}, {"name": "option_c", "dtype": "string"}, {"name": "option_d", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 295127, "num_examples": 1588}], "download_size": 124202, "dataset_size": 295127}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-13T18:42:39+00:00 | [] | [] | TAGS
#region-us
|
QA dataset for testing an LLM's ability to understand the meaning of Danish idioms. The dataset is based on Juunge/danske-talemaader.
| [] | [
"TAGS\n#region-us \n"
] |
b450eef50b7372ac14371cc05a7cc480c014e95d | # Dataset Card for "boolq_transformed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | skrishna/boolq_transformed | [
"region:us"
] | 2024-02-11T19:36:10+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "bool"}, {"name": "passage", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 12710260, "num_examples": 9427}, {"name": "test", "num_bytes": 4360958, "num_examples": 3270}], "download_size": 10120344, "dataset_size": 17071218}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-12T12:32:55+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "boolq_transformed"
More Information needed | [
"# Dataset Card for \"boolq_transformed\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"boolq_transformed\"\n\nMore Information needed"
] |
faa2233f37550ff8a0bdecd508d04e5ce978532d | # Dataset Card for "anthology"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | cestwc/anthology | [
"region:us"
] | 2024-02-11T19:45:35+00:00 | {"dataset_info": {"features": [{"name": "title", "dtype": "string"}, {"name": "author", "dtype": "string"}, {"name": "year", "dtype": "int64"}, {"name": "abstract", "dtype": "string"}, {"name": "pages", "dtype": "string"}, {"name": "queryID", "dtype": "string"}, {"name": "query", "dtype": "string"}, {"name": "paperID", "dtype": "string"}, {"name": "include", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 2533008313, "num_examples": 3370094}], "download_size": 1053579996, "dataset_size": 2533008313}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-14T08:21:15+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "anthology"
More Information needed | [
"# Dataset Card for \"anthology\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"anthology\"\n\nMore Information needed"
] |
4e573ec678635027357c6911e237f76dff9873f4 |
# Dataset Card for Evaluation run of lodrick-the-lafted/Hermes-Instruct-7B-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [lodrick-the-lafted/Hermes-Instruct-7B-v0.2](https://huggingface.co/lodrick-the-lafted/Hermes-Instruct-7B-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
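# "harness_winogrande_5" below is one of the 63 configs listed in this card;
# any other config name works too (e.g. "results" for the aggregated metrics).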
data = load_dataset("open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-v0.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-11T20:18:48.620177](https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-v0.2/blob/main/results_2024-02-11T20-18-48.620177.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6003468006961892,
"acc_stderr": 0.03316229479718134,
"acc_norm": 0.6045151076873472,
"acc_norm_stderr": 0.03383500767913834,
"mc1": 0.4541003671970624,
"mc1_stderr": 0.017429593091323522,
"mc2": 0.6101269592915214,
"mc2_stderr": 0.015563902613452118
},
"harness|arc:challenge|25": {
"acc": 0.5725255972696246,
"acc_stderr": 0.014456862944650647,
"acc_norm": 0.6092150170648464,
"acc_norm_stderr": 0.014258563880513782
},
"harness|hellaswag|10": {
"acc": 0.6414060944035053,
"acc_stderr": 0.004786075107572187,
"acc_norm": 0.8296156144194383,
"acc_norm_stderr": 0.003752017639083751
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03782728980865469,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03782728980865469
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.029373646253234686,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.029373646253234686
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.0250437573185202,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.0250437573185202
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667765,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667765
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.027807032360686088,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.027807032360686088
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5512820512820513,
"acc_stderr": 0.025217315184846482,
"acc_norm": 0.5512820512820513,
"acc_norm_stderr": 0.025217315184846482
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815635,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815635
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.01732435232501601,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.01732435232501601
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653063,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653063
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501954,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501954
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729245,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729245
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.02220930907316561,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.02220930907316561
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.01492744710193715,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.01492744710193715
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.02524826477424284,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.02524826477424284
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40558659217877097,
"acc_stderr": 0.01642167050633919,
"acc_norm": 0.40558659217877097,
"acc_norm_stderr": 0.01642167050633919
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.027245613047215355,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.027245613047215355
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971635,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971635
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.026624152478845853,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.026624152478845853
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4335071707953064,
"acc_stderr": 0.012656810383983965,
"acc_norm": 0.4335071707953064,
"acc_norm_stderr": 0.012656810383983965
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5845588235294118,
"acc_stderr": 0.029935342707877746,
"acc_norm": 0.5845588235294118,
"acc_norm_stderr": 0.029935342707877746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6062091503267973,
"acc_stderr": 0.019766211991073056,
"acc_norm": 0.6062091503267973,
"acc_norm_stderr": 0.019766211991073056
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727668,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727668
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4541003671970624,
"mc1_stderr": 0.017429593091323522,
"mc2": 0.6101269592915214,
"mc2_stderr": 0.015563902613452118
},
"harness|winogrande|5": {
"acc": 0.7687450670876085,
"acc_stderr": 0.011850040124850506
},
"harness|gsm8k|5": {
"acc": 0.41091736163760423,
"acc_stderr": 0.01355213290142322
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-v0.2 | [
"region:us"
] | 2024-02-11T20:21:05+00:00 | {"pretty_name": "Evaluation run of lodrick-the-lafted/Hermes-Instruct-7B-v0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [lodrick-the-lafted/Hermes-Instruct-7B-v0.2](https://huggingface.co/lodrick-the-lafted/Hermes-Instruct-7B-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-v0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T20:18:48.620177](https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-v0.2/blob/main/results_2024-02-11T20-18-48.620177.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6003468006961892,\n \"acc_stderr\": 0.03316229479718134,\n \"acc_norm\": 0.6045151076873472,\n \"acc_norm_stderr\": 0.03383500767913834,\n \"mc1\": 0.4541003671970624,\n \"mc1_stderr\": 0.017429593091323522,\n \"mc2\": 0.6101269592915214,\n \"mc2_stderr\": 0.015563902613452118\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5725255972696246,\n \"acc_stderr\": 0.014456862944650647,\n \"acc_norm\": 0.6092150170648464,\n \"acc_norm_stderr\": 0.014258563880513782\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6414060944035053,\n \"acc_stderr\": 0.004786075107572187,\n \"acc_norm\": 0.8296156144194383,\n \"acc_norm_stderr\": 0.003752017639083751\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.029373646253234686,\n \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.029373646253234686\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.032683358999363366,\n \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.032683358999363366\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3835978835978836,\n \"acc_stderr\": 0.0250437573185202,\n \"acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.0250437573185202\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n \"acc_stderr\": 0.026522709674667765,\n \"acc_norm\": 0.6806451612903226,\n \"acc_norm_stderr\": 0.026522709674667765\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.027807032360686088,\n \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.027807032360686088\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5512820512820513,\n \"acc_stderr\": 0.025217315184846482,\n \"acc_norm\": 0.5512820512820513,\n \"acc_norm_stderr\": 0.025217315184846482\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7944954128440367,\n \"acc_stderr\": 0.01732435232501601,\n \"acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.01732435232501601\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653063,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653063\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501954,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501954\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.028756799629658342,\n \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.028756799629658342\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729245,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729245\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.02220930907316561,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.02220930907316561\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7752234993614304,\n \"acc_stderr\": 0.01492744710193715,\n \"acc_norm\": 0.7752234993614304,\n \"acc_norm_stderr\": 0.01492744710193715\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424284,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424284\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40558659217877097,\n \"acc_stderr\": 0.01642167050633919,\n \"acc_norm\": 0.40558659217877097,\n \"acc_norm_stderr\": 0.01642167050633919\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.027245613047215355,\n \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.027245613047215355\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n \"acc_stderr\": 0.027368078243971635,\n \"acc_norm\": 0.6334405144694534,\n \"acc_norm_stderr\": 0.027368078243971635\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.026624152478845853,\n \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.026624152478845853\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n \"acc_stderr\": 0.012656810383983965,\n \"acc_norm\": 0.4335071707953064,\n \"acc_norm_stderr\": 0.012656810383983965\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5845588235294118,\n \"acc_stderr\": 0.029935342707877746,\n \"acc_norm\": 0.5845588235294118,\n \"acc_norm_stderr\": 0.029935342707877746\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6062091503267973,\n \"acc_stderr\": 0.019766211991073056,\n \"acc_norm\": 0.6062091503267973,\n \"acc_norm_stderr\": 0.019766211991073056\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683906,\n \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683906\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4541003671970624,\n \"mc1_stderr\": 0.017429593091323522,\n \"mc2\": 0.6101269592915214,\n \"mc2_stderr\": 0.015563902613452118\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.011850040124850506\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41091736163760423,\n \"acc_stderr\": 0.01355213290142322\n }\n}\n```", 
"repo_url": "https://huggingface.co/lodrick-the-lafted/Hermes-Instruct-7B-v0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|arc:challenge|25_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|gsm8k|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hellaswag|10_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T20-18-48.620177.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T20-18-48.620177.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T20-18-48.620177.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T20-18-48.620177.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T20-18-48.620177.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T20_18_48.620177", "path": ["**/details_harness|winogrande|5_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T20-18-48.620177.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_11T20_18_48.620177", "path": ["results_2024-02-11T20-18-48.620177.parquet"]}, {"split": "latest", "path": ["results_2024-02-11T20-18-48.620177.parquet"]}]}]} | 2024-02-11T20:21:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of lodrick-the-lafted/Hermes-Instruct-7B-v0.2
Dataset automatically created during the evaluation run of model lodrick-the-lafted/Hermes-Instruct-7B-v0.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
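A minimal sketch, using the same `load_dataset` pattern as the other cards in this collection (the `harness_winogrande_5` configuration shown here is one of the configurations listed for this dataset):

```python
from datasets import load_dataset

# Load the per-sample details for one evaluation task of this run.
data = load_dataset("open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-v0.2",
	"harness_winogrande_5",
	split="train")
```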
## Latest results
These are the latest results from run 2024-02-11T20:18:48.620177 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
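The aggregated numbers for this run are stored in the "results" configuration mentioned above; a minimal sketch for loading them (the "latest" split alias is assumed here, following the split layout listed in this card's configuration table):

```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" is assumed to alias the most recent timestamped split.
results = load_dataset("open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-v0.2",
	"results",
	split="latest")
```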
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of lodrick-the-lafted/Hermes-Instruct-7B-v0.2\n\n\n\nDataset automatically created during the evaluation run of model lodrick-the-lafted/Hermes-Instruct-7B-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T20:18:48.620177(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of lodrick-the-lafted/Hermes-Instruct-7B-v0.2\n\n\n\nDataset automatically created during the evaluation run of model lodrick-the-lafted/Hermes-Instruct-7B-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T20:18:48.620177(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
da37b032a59a2e465b2c8520fde1755e6905fdb3 |
# Dataset Card for Evaluation run of eren23/dpo-binarized-NeuralTrix-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [eren23/dpo-binarized-NeuralTrix-7B](https://huggingface.co/eren23/dpo-binarized-NeuralTrix-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eren23__dpo-binarized-NeuralTrix-7B",
"harness_winogrande_5",
split="train")
```
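Because the dataset exposes one configuration per evaluated task (plus the aggregated "results" configuration), it can be convenient to enumerate them before loading; a small sketch using the inspection helpers from `datasets` (the split names printed for a configuration are expected to be one per run timestamp, plus an alias that points at the most recent run, as described above):

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_eren23__dpo-binarized-NeuralTrix-7B"

# Enumerate the per-task configurations (63 of them) and the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Splits available for a single configuration of this dataset.
print(get_dataset_split_names(repo, "harness_winogrande_5"))
```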
## Latest results
These are the [latest results from run 2024-02-11T20:45:49.015685](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__dpo-binarized-NeuralTrix-7B/blob/main/results_2024-02-11T20-45-49.015685.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6469364528234108,
"acc_stderr": 0.032183894515300515,
"acc_norm": 0.6464632195656521,
"acc_norm_stderr": 0.03285550090176264,
"mc1": 0.6364749082007344,
"mc1_stderr": 0.016838862883965834,
"mc2": 0.7906684401427805,
"mc2_stderr": 0.013527182281452275
},
"harness|arc:challenge|25": {
"acc": 0.6962457337883959,
"acc_stderr": 0.01343890918477876,
"acc_norm": 0.7235494880546075,
"acc_norm_stderr": 0.013069662474252425
},
"harness|hellaswag|10": {
"acc": 0.7118103963353913,
"acc_stderr": 0.004519941716508364,
"acc_norm": 0.8888667596096396,
"acc_norm_stderr": 0.0031365472766898884
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404907,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941197,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941197
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066485,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.441340782122905,
"acc_stderr": 0.016607021781050873,
"acc_norm": 0.441340782122905,
"acc_norm_stderr": 0.016607021781050873
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079067,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079067
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6364749082007344,
"mc1_stderr": 0.016838862883965834,
"mc2": 0.7906684401427805,
"mc2_stderr": 0.013527182281452275
},
"harness|winogrande|5": {
"acc": 0.846093133385951,
"acc_stderr": 0.01014194452375004
},
"harness|gsm8k|5": {
"acc": 0.6800606520090978,
"acc_stderr": 0.012848426555240761
}
}
```
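If you want to reload these aggregated figures programmatically, a minimal sketch (assuming the Hugging Face `datasets` library; the `results` configuration and its `latest` split follow the repository's config layout) looks like this:

```python
from datasets import load_dataset

# Load the aggregated metrics shown above; the "latest" split tracks the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_eren23__dpo-binarized-NeuralTrix-7B",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated scores for this run
```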
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_eren23__dpo-binarized-NeuralTrix-7B | [
"region:us"
] | 2024-02-11T20:48:08+00:00 | {"pretty_name": "Evaluation run of eren23/dpo-binarized-NeuralTrix-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [eren23/dpo-binarized-NeuralTrix-7B](https://huggingface.co/eren23/dpo-binarized-NeuralTrix-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eren23__dpo-binarized-NeuralTrix-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T20:45:49.015685](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__dpo-binarized-NeuralTrix-7B/blob/main/results_2024-02-11T20-45-49.015685.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6469364528234108,\n \"acc_stderr\": 0.032183894515300515,\n \"acc_norm\": 0.6464632195656521,\n \"acc_norm_stderr\": 0.03285550090176264,\n \"mc1\": 0.6364749082007344,\n \"mc1_stderr\": 0.016838862883965834,\n \"mc2\": 0.7906684401427805,\n \"mc2_stderr\": 0.013527182281452275\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6962457337883959,\n \"acc_stderr\": 0.01343890918477876,\n \"acc_norm\": 0.7235494880546075,\n \"acc_norm_stderr\": 0.013069662474252425\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7118103963353913,\n \"acc_stderr\": 0.004519941716508364,\n \"acc_norm\": 0.8888667596096396,\n \"acc_norm_stderr\": 0.0031365472766898884\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404907,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404907\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941197,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8212005108556832,\n \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.441340782122905,\n \"acc_stderr\": 0.016607021781050873,\n \"acc_norm\": 0.441340782122905,\n \"acc_norm_stderr\": 0.016607021781050873\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02609016250427905,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02609016250427905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n \"acc_stderr\": 0.012747248967079067,\n \"acc_norm\": 0.470013037809648,\n \"acc_norm_stderr\": 0.012747248967079067\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6364749082007344,\n \"mc1_stderr\": 0.016838862883965834,\n \"mc2\": 0.7906684401427805,\n \"mc2_stderr\": 0.013527182281452275\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.846093133385951,\n \"acc_stderr\": 0.01014194452375004\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6800606520090978,\n \"acc_stderr\": 0.012848426555240761\n 
}\n}\n```", "repo_url": "https://huggingface.co/eren23/dpo-binarized-NeuralTrix-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|arc:challenge|25_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|gsm8k|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hellaswag|10_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T20-45-49.015685.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T20-45-49.015685.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T20-45-49.015685.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T20-45-49.015685.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T20-45-49.015685.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T20_45_49.015685", "path": ["**/details_harness|winogrande|5_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T20-45-49.015685.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_11T20_45_49.015685", "path": ["results_2024-02-11T20-45-49.015685.parquet"]}, {"split": "latest", "path": ["results_2024-02-11T20-45-49.015685.parquet"]}]}]} | 2024-02-11T20:48:30+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of eren23/dpo-binarized-NeuralTrix-7B
Dataset automatically created during the evaluation run of model eren23/dpo-binarized-NeuralTrix-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
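For example, with the Hugging Face `datasets` library (the `harness_winogrande_5` configuration shown here is just one of the 63 available; any other configuration name from this repository works the same way):

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task (here: Winogrande, 5-shot).
data = load_dataset(
    "open-llm-leaderboard/details_eren23__dpo-binarized-NeuralTrix-7B",
    "harness_winogrande_5",
    split="train",
)
```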
## Latest results
These are the latest results from run 2024-02-11T20:45:49.015685 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of eren23/dpo-binarized-NeuralTrix-7B\n\n\n\nDataset automatically created during the evaluation run of model eren23/dpo-binarized-NeuralTrix-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T20:45:49.015685(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of eren23/dpo-binarized-NeuralTrix-7B\n\n\n\nDataset automatically created during the evaluation run of model eren23/dpo-binarized-NeuralTrix-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T20:45:49.015685(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
f6b25a74db9d58e4b6ffe27dea2e0124f4cfea05 |
# FractalDB 60
FractalDB 60 dataset from [Pre-training without Natural Images](https://hirokatsukataoka16.github.io/Pretraining-without-Natural-Images/).
[Original repo](https://github.com/hirokatsukataoka16/FractalDB-Pretrained-ResNet-PyTorch) | [Project page](https://hirokatsukataoka16.github.io/Pretraining-without-Natural-Images/) | [arXiv](https://arxiv.org/abs/2101.08515)
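The images can be loaded directly with the `datasets` library; the snippet below is a minimal sketch based on this dataset's configuration (a single `train` split with `image` and `label` features).

```python
from datasets import load_dataset

# Load the single training split: 60,000 fractal images over 60 classes
dataset = load_dataset("p1atdev/FractalDB-60", split="train")

# Each example carries a PIL image and an integer class label
example = dataset[0]
print(example["label"], example["image"].size)
```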
## Citing
```bibtex
@article{KataokaIJCV2022,
author={Kataoka, Hirokatsu and Okayasu, Kazushige and Matsumoto, Asato and Yamagata, Eisuke and Yamada, Ryosuke and Inoue, Nakamasa and Nakamura, Akio and Satoh, Yutaka},
title={Pre-training without Natural Images},
  journal={International Journal of Computer Vision (IJCV)},
year={2022},
}
@inproceedings{KataokaACCV2020,
author={Kataoka, Hirokatsu and Okayasu, Kazushige and Matsumoto, Asato and Yamagata, Eisuke and Yamada, Ryosuke and Inoue, Nakamasa and Nakamura, Akio and Satoh, Yutaka},
title={Pre-training without Natural Images},
booktitle={Asian Conference on Computer Vision (ACCV)},
year={2020},
}
@misc{kataoka2021pretraining,
title={Pre-training without Natural Images},
author={Hirokatsu Kataoka and Kazushige Okayasu and Asato Matsumoto and Eisuke Yamagata and Ryosuke Yamada and Nakamasa Inoue and Akio Nakamura and Yutaka Satoh},
year={2021},
eprint={2101.08515},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
``` | p1atdev/FractalDB-60 | [
"task_categories:image-classification",
"size_categories:10K<n<100K",
"license:cc-by-4.0",
"arxiv:2101.08515",
"region:us"
] | 2024-02-11T21:39:24+00:00 | {"license": "cc-by-4.0", "size_categories": ["10K<n<100K"], "task_categories": ["image-classification"], "pretty_name": "FractalDB 60 ", "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "a1", "1": "a2", "2": "a3", "3": "a4", "4": "ammonite", "5": "bamboofern", "6": "bedder", "7": "binary", "8": "branch", "9": "broccoli", "10": "bud", "11": "c_curve", "12": "castle", "13": "cedarleaf", "14": "coral", "15": "crystal", "16": "deerfern", "17": "dragon_curve", "18": "drumlin", "19": "fern", "20": "filmyfern", "21": "fleabane", "22": "flower", "23": "gaku", "24": "ginkgo", "25": "gold_dragon", "26": "grassfern", "27": "greygoldenrod", "28": "groundpine", "29": "involucre", "30": "koch_curve", "31": "koch_snowflake", "32": "maple_leaf", "33": "mcWorter_pedigree", "34": "morningglory", "35": "newyorkfern", "36": "octopuslegs", "37": "penta", "38": "pinetree", "39": "rose", "40": "shieldfern", "41": "sierpinski_carpet", "42": "sierpinski_gasket", "43": "sierpinski_pentagon", "44": "snail", "45": "snowcap", "46": "snowdrift", "47": "spiderbrake", "48": "spiral", "49": "spleenwort_fern", "50": "star", "51": "sticks", "52": "sunflower", "53": "supernova", "54": "swirl", "55": "tree", "56": "turbanshell", "57": "umbrellafern", "58": "watersprite", "59": "zigzag"}}}}], "splits": [{"name": "train", "num_bytes": 3588623140, "num_examples": 60000}], "download_size": 1829671228, "dataset_size": 3588623140}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-12T00:42:03+00:00 | [
"2101.08515"
] | [] | TAGS
#task_categories-image-classification #size_categories-10K<n<100K #license-cc-by-4.0 #arxiv-2101.08515 #region-us
|
# FractalDB 60
FractalDB 60 dataset from Pre-training without Natural Images.
Original repo | Project page | arXiv
## Citing
| [
"# FractalDB 60\n\nFractalDB 60 dataset from Pre-training without Natural Images.\n\nOriginal repo | Project page | arXiv",
"## Citing"
] | [
"TAGS\n#task_categories-image-classification #size_categories-10K<n<100K #license-cc-by-4.0 #arxiv-2101.08515 #region-us \n",
"# FractalDB 60\n\nFractalDB 60 dataset from Pre-training without Natural Images.\n\nOriginal repo | Project page | arXiv",
"## Citing"
] |
c92ddba878f8695ac5d78d9320fe2edbe5c413d4 |
# Dataset Card for Evaluation run of Radu1999/MisterUkrainianDPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Radu1999/MisterUkrainianDPO](https://huggingface.co/Radu1999/MisterUkrainianDPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Radu1999__MisterUkrainianDPO",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-11T21:42:01.189462](https://huggingface.co/datasets/open-llm-leaderboard/details_Radu1999__MisterUkrainianDPO/blob/main/results_2024-02-11T21-42-01.189462.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6324628678862224,
"acc_stderr": 0.03270388322615121,
"acc_norm": 0.6341483790631317,
"acc_norm_stderr": 0.033366074107875524,
"mc1": 0.5422276621787026,
"mc1_stderr": 0.017440965712482125,
"mc2": 0.7017912002538903,
"mc2_stderr": 0.014627069760640677
},
"harness|arc:challenge|25": {
"acc": 0.6399317406143344,
"acc_stderr": 0.014027516814585186,
"acc_norm": 0.6834470989761092,
"acc_norm_stderr": 0.013592431519068079
},
"harness|hellaswag|10": {
"acc": 0.676458872734515,
"acc_stderr": 0.004668710689192403,
"acc_norm": 0.867755427205736,
"acc_norm_stderr": 0.003380641470989923
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337152,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337152
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266344,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266344
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067884,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067884
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6870967741935484,
"acc_stderr": 0.02637756702864586,
"acc_norm": 0.6870967741935484,
"acc_norm_stderr": 0.02637756702864586
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.02439667298509477,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.02439667298509477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931666,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931666
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848036,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848036
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.034063153607115065,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.034063153607115065
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909476,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909476
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02492200116888633,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02492200116888633
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427044,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427044
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897224,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897224
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487036,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487036
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.02927956741106568,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.02927956741106568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6019900497512438,
"acc_stderr": 0.034611994290400135,
"acc_norm": 0.6019900497512438,
"acc_norm_stderr": 0.034611994290400135
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774707,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774707
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160875,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160875
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5422276621787026,
"mc1_stderr": 0.017440965712482125,
"mc2": 0.7017912002538903,
"mc2_stderr": 0.014627069760640677
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.01108253884749191
},
"harness|gsm8k|5": {
"acc": 0.5928733889310084,
"acc_stderr": 0.013532811069356535
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Radu1999__MisterUkrainianDPO | [
"region:us"
] | 2024-02-11T21:44:23+00:00 | {"pretty_name": "Evaluation run of Radu1999/MisterUkrainianDPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [Radu1999/MisterUkrainianDPO](https://huggingface.co/Radu1999/MisterUkrainianDPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Radu1999__MisterUkrainianDPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T21:42:01.189462](https://huggingface.co/datasets/open-llm-leaderboard/details_Radu1999__MisterUkrainianDPO/blob/main/results_2024-02-11T21-42-01.189462.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6324628678862224,\n \"acc_stderr\": 0.03270388322615121,\n \"acc_norm\": 0.6341483790631317,\n \"acc_norm_stderr\": 0.033366074107875524,\n \"mc1\": 0.5422276621787026,\n \"mc1_stderr\": 0.017440965712482125,\n \"mc2\": 0.7017912002538903,\n \"mc2_stderr\": 0.014627069760640677\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6399317406143344,\n \"acc_stderr\": 0.014027516814585186,\n \"acc_norm\": 0.6834470989761092,\n \"acc_norm_stderr\": 0.013592431519068079\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.676458872734515,\n \"acc_stderr\": 0.004668710689192403,\n \"acc_norm\": 0.867755427205736,\n \"acc_norm_stderr\": 0.003380641470989923\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337152,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337152\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266344,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266344\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067884,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067884\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6870967741935484,\n \"acc_stderr\": 0.02637756702864586,\n \"acc_norm\": 0.6870967741935484,\n \"acc_norm_stderr\": 0.02637756702864586\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.02439667298509477,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.02439667298509477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931666,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931666\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848036,\n \"acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848036\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.47685185185185186,\n \"acc_stderr\": 0.034063153607115065,\n \"acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.034063153607115065\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909476,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909476\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.02492200116888633,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02492200116888633\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427044,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427044\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.012741974333897224,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.012741974333897224\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487036,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487036\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.02927956741106568,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.02927956741106568\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6019900497512438,\n \"acc_stderr\": 0.034611994290400135,\n \"acc_norm\": 0.6019900497512438,\n \"acc_norm_stderr\": 0.034611994290400135\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774707,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774707\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160875,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160875\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5422276621787026,\n \"mc1_stderr\": 0.017440965712482125,\n \"mc2\": 0.7017912002538903,\n \"mc2_stderr\": 0.014627069760640677\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8074191002367798,\n \"acc_stderr\": 0.01108253884749191\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5928733889310084,\n \"acc_stderr\": 0.013532811069356535\n 
}\n}\n```", "repo_url": "https://huggingface.co/Radu1999/MisterUkrainianDPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|arc:challenge|25_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|gsm8k|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hellaswag|10_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T21-42-01.189462.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T21-42-01.189462.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T21-42-01.189462.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T21-42-01.189462.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T21-42-01.189462.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T21_42_01.189462", "path": ["**/details_harness|winogrande|5_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T21-42-01.189462.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_11T21_42_01.189462", "path": ["results_2024-02-11T21-42-01.189462.parquet"]}, {"split": "latest", "path": ["results_2024-02-11T21-42-01.189462.parquet"]}]}]} | 2024-02-11T21:44:44+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Radu1999/MisterUkrainianDPO
Dataset automatically created during the evaluation run of model Radu1999/MisterUkrainianDPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
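A minimal sketch, assuming this run follows the same repository naming convention (`open-llm-leaderboard/details_<org>__<model>`) and configuration names used by the other cards in this document:
```python
from datasets import load_dataset

# Assumed repository id, following the details_<org>__<model> convention used elsewhere in this document.
data = load_dataset("open-llm-leaderboard/details_Radu1999__MisterUkrainianDPO",
	"harness_winogrande_5",
	split="train")
```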
## Latest results
These are the latest results from run 2024-02-11T21:42:01.189462 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Radu1999/MisterUkrainianDPO\n\n\n\nDataset automatically created during the evaluation run of model Radu1999/MisterUkrainianDPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T21:42:01.189462(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Radu1999/MisterUkrainianDPO\n\n\n\nDataset automatically created during the evaluation run of model Radu1999/MisterUkrainianDPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T21:42:01.189462(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
8d553a34ec20e62d7ae64e8d23b0047c6a858394 |
## Sources of the corpus used
- [Opus](https://opus.nlpl.eu/results/en&eu/corpus-result-table)
- [Orai](https://www.orai.eus/en/resources)
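A minimal loading sketch, assuming the corpus can be read with the Hugging Face `datasets` library under its default configuration (the exact configuration and split names are not documented on this card):
```python
from datasets import load_dataset

# Assumption: the default configuration exposes the parallel data; adjust the
# configuration/split names once they are documented for this corpus.
corpus = load_dataset("xezpeleta/parallel-basque-corpus")
print(corpus)
```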
| xezpeleta/parallel-basque-corpus | [
"task_categories:translation",
"language:eu",
"language:en",
"language:es",
"language:fr",
"region:us"
] | 2024-02-11T22:03:26+00:00 | {"language": ["eu", "en", "es", "fr"], "task_categories": ["translation"], "pretty_name": "Parallel Basque Corpus"} | 2024-02-12T12:15:37+00:00 | [] | [
"eu",
"en",
"es",
"fr"
] | TAGS
#task_categories-translation #language-Basque #language-English #language-Spanish #language-French #region-us
|
## Sources of the corpus used
- Opus
- Orai
| [
"## Sources of the corpus used\n\n- Opus\n- Orai"
] | [
"TAGS\n#task_categories-translation #language-Basque #language-English #language-Spanish #language-French #region-us \n",
"## Sources of the corpus used\n\n- Opus\n- Orai"
] |
fe1278ff9c7804dcc805bf8c68ce8943718568d8 |
# Dataset Card for Evaluation run of yam-peleg/Experiment8-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yam-peleg/Experiment8-7B](https://huggingface.co/yam-peleg/Experiment8-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yam-peleg__Experiment8-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-11T22:11:17.659177](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment8-7B/blob/main/results_2024-02-11T22-11-17.659177.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6566450072812023,
"acc_stderr": 0.0320051313605246,
"acc_norm": 0.6575118910656207,
"acc_norm_stderr": 0.032652021481169954,
"mc1": 0.5605875152998776,
"mc1_stderr": 0.017374520482513704,
"mc2": 0.7024529326790527,
"mc2_stderr": 0.015023401127362113
},
"harness|arc:challenge|25": {
"acc": 0.7013651877133106,
"acc_stderr": 0.013374078615068745,
"acc_norm": 0.7209897610921502,
"acc_norm_stderr": 0.013106784883601334
},
"harness|hellaswag|10": {
"acc": 0.7120095598486357,
"acc_stderr": 0.004519011688417168,
"acc_norm": 0.8812985461063533,
"acc_norm_stderr": 0.0032277587155455987
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.04940635630605659,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.04940635630605659
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4417989417989418,
"acc_stderr": 0.02557625706125383,
"acc_norm": 0.4417989417989418,
"acc_norm_stderr": 0.02557625706125383
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608308,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608308
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4569832402234637,
"acc_stderr": 0.01666049858050917,
"acc_norm": 0.4569832402234637,
"acc_norm_stderr": 0.01666049858050917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083138,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083138
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5605875152998776,
"mc1_stderr": 0.017374520482513704,
"mc2": 0.7024529326790527,
"mc2_stderr": 0.015023401127362113
},
"harness|winogrande|5": {
"acc": 0.8066298342541437,
"acc_stderr": 0.01109979664592053
},
"harness|gsm8k|5": {
"acc": 0.6444275966641395,
"acc_stderr": 0.013185402252713852
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_yam-peleg__Experiment8-7B | [
"region:us"
] | 2024-02-11T22:13:33+00:00 | {"pretty_name": "Evaluation run of yam-peleg/Experiment8-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [yam-peleg/Experiment8-7B](https://huggingface.co/yam-peleg/Experiment8-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yam-peleg__Experiment8-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T22:11:17.659177](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment8-7B/blob/main/results_2024-02-11T22-11-17.659177.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6566450072812023,\n \"acc_stderr\": 0.0320051313605246,\n \"acc_norm\": 0.6575118910656207,\n \"acc_norm_stderr\": 0.032652021481169954,\n \"mc1\": 0.5605875152998776,\n \"mc1_stderr\": 0.017374520482513704,\n \"mc2\": 0.7024529326790527,\n \"mc2_stderr\": 0.015023401127362113\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068745,\n \"acc_norm\": 0.7209897610921502,\n \"acc_norm_stderr\": 0.013106784883601334\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7120095598486357,\n \"acc_stderr\": 0.004519011688417168,\n \"acc_norm\": 0.8812985461063533,\n \"acc_norm_stderr\": 0.0032277587155455987\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.04940635630605659,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.04940635630605659\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4417989417989418,\n \"acc_stderr\": 0.02557625706125383,\n \"acc_norm\": 0.4417989417989418,\n \"acc_norm_stderr\": 0.02557625706125383\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n \"acc_stderr\": 0.013428186370608308,\n \"acc_norm\": 
0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608308\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4569832402234637,\n \"acc_stderr\": 0.01666049858050917,\n \"acc_norm\": 0.4569832402234637,\n \"acc_norm_stderr\": 0.01666049858050917\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083138,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083138\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5605875152998776,\n \"mc1_stderr\": 0.017374520482513704,\n \"mc2\": 0.7024529326790527,\n \"mc2_stderr\": 0.015023401127362113\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8066298342541437,\n \"acc_stderr\": 0.01109979664592053\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6444275966641395,\n \"acc_stderr\": 0.013185402252713852\n }\n}\n```", "repo_url": 
"https://huggingface.co/yam-peleg/Experiment8-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|arc:challenge|25_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|gsm8k|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hellaswag|10_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T22-11-17.659177.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T22-11-17.659177.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T22-11-17.659177.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T22-11-17.659177.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T22-11-17.659177.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T22-11-17.659177.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["**/details_harness|winogrande|5_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T22-11-17.659177.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_11T22_11_17.659177", "path": ["results_2024-02-11T22-11-17.659177.parquet"]}, {"split": "latest", "path": 
["results_2024-02-11T22-11-17.659177.parquet"]}]}]} | 2024-02-11T22:13:54+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of yam-peleg/Experiment8-7B
Dataset automatically created during the evaluation run of model yam-peleg/Experiment8-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-11T22:11:17.659177 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of yam-peleg/Experiment8-7B\n\n\n\nDataset automatically created during the evaluation run of model yam-peleg/Experiment8-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T22:11:17.659177(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of yam-peleg/Experiment8-7B\n\n\n\nDataset automatically created during the evaluation run of model yam-peleg/Experiment8-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T22:11:17.659177(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
544238c8d7f1be3d78cc961c7dca302f8f0a8800 |
# Dataset Card for Evaluation run of max-2022/test_mistral2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [max-2022/test_mistral2](https://huggingface.co/max-2022/test_mistral2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_max-2022__test_mistral2",
"harness_winogrande_5",
split="train")
```
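Beyond a single task, you can also enumerate the available configurations or pull the aggregated scores directly. The snippet below is a minimal sketch: it uses the `datasets` utilities `get_dataset_config_names` and `load_dataset`, and assumes the config names and the "latest" split match the layout listed in this card's metadata.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_max-2022__test_mistral2"

# List every configuration (one per evaluated task, plus "results").
configs = get_dataset_config_names(repo)
print(len(configs), "configurations available")

# Load the aggregated metrics; the "latest" split points to the most recent run.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```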
## Latest results
These are the [latest results from run 2024-02-11T22:17:01.815383](https://huggingface.co/datasets/open-llm-leaderboard/details_max-2022__test_mistral2/blob/main/results_2024-02-11T22-17-01.815383.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24710575900369156,
"acc_stderr": 0.03055560450787355,
"acc_norm": 0.24800687316365916,
"acc_norm_stderr": 0.0313664960522948,
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862666,
"mc2": 0.490951583190258,
"mc2_stderr": 0.016977888460336696
},
"harness|arc:challenge|25": {
"acc": 0.23037542662116042,
"acc_stderr": 0.01230492841874761,
"acc_norm": 0.2790102389078498,
"acc_norm_stderr": 0.013106784883601338
},
"harness|hellaswag|10": {
"acc": 0.2575184226249751,
"acc_stderr": 0.004363736410689625,
"acc_norm": 0.25323640709022105,
"acc_norm_stderr": 0.004339764434219064
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.03502553170678318,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.03502553170678318
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.25,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23773584905660378,
"acc_stderr": 0.0261998088075619,
"acc_norm": 0.23773584905660378,
"acc_norm_stderr": 0.0261998088075619
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2013888888888889,
"acc_stderr": 0.033536474697138406,
"acc_norm": 0.2013888888888889,
"acc_norm_stderr": 0.033536474697138406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.03345036916788991,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.03345036916788991
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.03068302084323101,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.03068302084323101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924812,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924812
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23870967741935484,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.23870967741935484,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21182266009852216,
"acc_stderr": 0.02874898368994106,
"acc_norm": 0.21182266009852216,
"acc_norm_stderr": 0.02874898368994106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.16161616161616163,
"acc_stderr": 0.026225919863629283,
"acc_norm": 0.16161616161616163,
"acc_norm_stderr": 0.026225919863629283
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3005181347150259,
"acc_stderr": 0.0330881859441575,
"acc_norm": 0.3005181347150259,
"acc_norm_stderr": 0.0330881859441575
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2153846153846154,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.2153846153846154,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02671924078371215,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02671924078371215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958934,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473834,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473834
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26055045871559634,
"acc_stderr": 0.018819182034850068,
"acc_norm": 0.26055045871559634,
"acc_norm_stderr": 0.018819182034850068
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03141554629402543,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03141554629402543
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3094170403587444,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.3094170403587444,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.21487603305785125,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.21487603305785125,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.20535714285714285,
"acc_stderr": 0.03834241021419073,
"acc_norm": 0.20535714285714285,
"acc_norm_stderr": 0.03834241021419073
},
"harness|hendrycksTest-management|5": {
"acc": 0.1650485436893204,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.1650485436893204,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.029058588303748842,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.029058588303748842
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2515964240102171,
"acc_stderr": 0.015517322365529627,
"acc_norm": 0.2515964240102171,
"acc_norm_stderr": 0.015517322365529627
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.02279711027807113,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.02279711027807113
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2346368715083799,
"acc_stderr": 0.014173044098303665,
"acc_norm": 0.2346368715083799,
"acc_norm_stderr": 0.014173044098303665
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02380518652488815,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02380518652488815
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.22508038585209003,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.22508038585209003,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.024922001168886324,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.024922001168886324
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307847,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307847
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24771838331160365,
"acc_stderr": 0.011025499291443737,
"acc_norm": 0.24771838331160365,
"acc_norm_stderr": 0.011025499291443737
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2536764705882353,
"acc_stderr": 0.026431329870789527,
"acc_norm": 0.2536764705882353,
"acc_norm_stderr": 0.026431329870789527
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2163265306122449,
"acc_stderr": 0.026358916334904052,
"acc_norm": 0.2163265306122449,
"acc_norm_stderr": 0.026358916334904052
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22388059701492538,
"acc_stderr": 0.029475250236017176,
"acc_norm": 0.22388059701492538,
"acc_norm_stderr": 0.029475250236017176
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.0357160923005348,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.0357160923005348
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862666,
"mc2": 0.490951583190258,
"mc2_stderr": 0.016977888460336696
},
"harness|winogrande|5": {
"acc": 0.48539857932123126,
"acc_stderr": 0.014046492383275834
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
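If you want to work with these numbers programmatically rather than read the JSON by eye, a small helper can average the per-subject MMLU (hendrycksTest) accuracies from a dictionary shaped like the one above. This is a convenience sketch, not the leaderboard's official aggregation code; the `sample` dictionary below is a hypothetical stand-in for the parsed results.

```python
def mean_mmlu_accuracy(results: dict) -> float:
    """Average the 'acc' field over every 'harness|hendrycksTest-*' entry."""
    scores = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(scores) / len(scores) if scores else float("nan")

# Tiny illustrative input mirroring a few entries from the JSON above.
sample = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.2},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.2074074074074074},
    "harness|winogrande|5": {"acc": 0.48539857932123126},  # ignored: not an MMLU sub-task
}
print(f"Mean MMLU accuracy: {mean_mmlu_accuracy(sample):.4f}")
```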
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_max-2022__test_mistral2 | [
"region:us"
] | 2024-02-11T22:19:19+00:00 | {"pretty_name": "Evaluation run of max-2022/test_mistral2", "dataset_summary": "Dataset automatically created during the evaluation run of model [max-2022/test_mistral2](https://huggingface.co/max-2022/test_mistral2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_max-2022__test_mistral2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T22:17:01.815383](https://huggingface.co/datasets/open-llm-leaderboard/details_max-2022__test_mistral2/blob/main/results_2024-02-11T22-17-01.815383.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24710575900369156,\n \"acc_stderr\": 0.03055560450787355,\n \"acc_norm\": 0.24800687316365916,\n \"acc_norm_stderr\": 0.0313664960522948,\n \"mc1\": 0.23133414932680538,\n \"mc1_stderr\": 0.014761945174862666,\n \"mc2\": 0.490951583190258,\n \"mc2_stderr\": 0.016977888460336696\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.23037542662116042,\n \"acc_stderr\": 0.01230492841874761,\n \"acc_norm\": 0.2790102389078498,\n \"acc_norm_stderr\": 0.013106784883601338\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2575184226249751,\n \"acc_stderr\": 0.004363736410689625,\n \"acc_norm\": 0.25323640709022105,\n \"acc_norm_stderr\": 0.004339764434219064\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2074074074074074,\n \"acc_stderr\": 0.03502553170678318,\n \"acc_norm\": 0.2074074074074074,\n \"acc_norm_stderr\": 0.03502553170678318\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03523807393012047,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03523807393012047\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.23773584905660378,\n \"acc_stderr\": 0.0261998088075619,\n \"acc_norm\": 0.23773584905660378,\n \"acc_norm_stderr\": 0.0261998088075619\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2013888888888889,\n \"acc_stderr\": 0.033536474697138406,\n \"acc_norm\": 0.2013888888888889,\n \"acc_norm_stderr\": 0.033536474697138406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 
0.04229525846816505\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.03345036916788991,\n \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.03345036916788991\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.03068302084323101,\n \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.03068302084323101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924812,\n \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924812\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23870967741935484,\n \"acc_stderr\": 0.024251071262208837,\n \"acc_norm\": 0.23870967741935484,\n \"acc_norm_stderr\": 0.024251071262208837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.21182266009852216,\n \"acc_stderr\": 0.02874898368994106,\n \"acc_norm\": 0.21182266009852216,\n \"acc_norm_stderr\": 0.02874898368994106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624336,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.16161616161616163,\n \"acc_stderr\": 0.026225919863629283,\n \"acc_norm\": 0.16161616161616163,\n \"acc_norm_stderr\": 0.026225919863629283\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.3005181347150259,\n \"acc_stderr\": 0.0330881859441575,\n \"acc_norm\": 0.3005181347150259,\n \"acc_norm_stderr\": 0.0330881859441575\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2153846153846154,\n \"acc_stderr\": 
0.020843034557462878,\n \"acc_norm\": 0.2153846153846154,\n \"acc_norm_stderr\": 0.020843034557462878\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02671924078371215,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02671924078371215\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958934,\n \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958934\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473834,\n \"acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473834\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.26055045871559634,\n \"acc_stderr\": 0.018819182034850068,\n \"acc_norm\": 0.26055045871559634,\n \"acc_norm_stderr\": 0.018819182034850068\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.03141554629402543,\n \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.03141554629402543\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3094170403587444,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.3094170403587444,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.040103589424622034,\n \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.040103589424622034\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.21487603305785125,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.21487603305785125,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.20535714285714285,\n \"acc_stderr\": 0.03834241021419073,\n \"acc_norm\": 0.20535714285714285,\n \"acc_norm_stderr\": 0.03834241021419073\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1650485436893204,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.1650485436893204,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2692307692307692,\n \"acc_stderr\": 0.029058588303748842,\n \"acc_norm\": 0.2692307692307692,\n \"acc_norm_stderr\": 0.029058588303748842\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2515964240102171,\n \"acc_stderr\": 0.015517322365529627,\n \"acc_norm\": 0.2515964240102171,\n \"acc_norm_stderr\": 
0.015517322365529627\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.02279711027807113,\n \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.02279711027807113\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2346368715083799,\n \"acc_stderr\": 0.014173044098303665,\n \"acc_norm\": 0.2346368715083799,\n \"acc_norm_stderr\": 0.014173044098303665\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.02380518652488815,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02380518652488815\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.22508038585209003,\n \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.22508038585209003,\n \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.024922001168886324,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.024922001168886324\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307847,\n \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307847\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24771838331160365,\n \"acc_stderr\": 0.011025499291443737,\n \"acc_norm\": 0.24771838331160365,\n \"acc_norm_stderr\": 0.011025499291443737\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2536764705882353,\n \"acc_stderr\": 0.026431329870789527,\n \"acc_norm\": 0.2536764705882353,\n \"acc_norm_stderr\": 0.026431329870789527\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2630718954248366,\n \"acc_stderr\": 0.017812676542320657,\n \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.017812676542320657\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2163265306122449,\n \"acc_stderr\": 0.026358916334904052,\n \"acc_norm\": 0.2163265306122449,\n \"acc_norm_stderr\": 0.026358916334904052\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22388059701492538,\n \"acc_stderr\": 0.029475250236017176,\n \"acc_norm\": 0.22388059701492538,\n \"acc_norm_stderr\": 0.029475250236017176\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n \"acc_stderr\": 0.0357160923005348,\n \"acc_norm\": 0.30120481927710846,\n \"acc_norm_stderr\": 0.0357160923005348\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824563,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824563\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23133414932680538,\n \"mc1_stderr\": 0.014761945174862666,\n \"mc2\": 0.490951583190258,\n \"mc2_stderr\": 0.016977888460336696\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.48539857932123126,\n \"acc_stderr\": 0.014046492383275834\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/max-2022/test_mistral2", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|arc:challenge|25_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|gsm8k|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hellaswag|10_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T22-17-01.815383.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T22-17-01.815383.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T22-17-01.815383.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T22-17-01.815383.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T22-17-01.815383.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T22-17-01.815383.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["**/details_harness|winogrande|5_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T22-17-01.815383.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_11T22_17_01.815383", "path": ["results_2024-02-11T22-17-01.815383.parquet"]}, {"split": "latest", "path": 
["results_2024-02-11T22-17-01.815383.parquet"]}]}]} | 2024-02-11T22:19:40+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of max-2022/test_mistral2
Dataset automatically created during the evaluation run of model max-2022/test_mistral2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
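A minimal sketch, assuming the repository id follows the leaderboard's usual `details_<org>__<model>` naming and using the `harness_winogrande_5` configuration listed in this card's metadata:

```python
from datasets import load_dataset

# Load the Winogrande details; the "train" split points to the latest results
data = load_dataset("open-llm-leaderboard/details_max-2022__test_mistral2",
	"harness_winogrande_5",
	split="train")
```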
## Latest results
These are the latest results from run 2024-02-11T22:17:01.815383 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of max-2022/test_mistral2\n\n\n\nDataset automatically created during the evaluation run of model max-2022/test_mistral2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T22:17:01.815383(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of max-2022/test_mistral2\n\n\n\nDataset automatically created during the evaluation run of model max-2022/test_mistral2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T22:17:01.815383(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
6335a7cf8e8c57ff609f869a7f97306f696a1b29 |
# FractalDB 1k
FractalDB 1k is a synthetic dataset of automatically generated fractal images, introduced in [Pre-training without Natural Images](https://hirokatsukataoka16.github.io/Pretraining-without-Natural-Images/).
[Original repo](https://github.com/hirokatsukataoka16/FractalDB-Pretrained-ResNet-PyTorch) | [Project page](https://hirokatsukataoka16.github.io/Pretraining-without-Natural-Images/) | [arXiv](https://arxiv.org/abs/2101.08515)
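A minimal loading sketch with the `datasets` library, assuming the default configuration and the single `train` split declared in this card's metadata:

```python
from datasets import load_dataset

# FractalDB 1k: 1,000,000 synthetic fractal images across 1,000 classes
dataset = load_dataset("p1atdev/FractalDB-1k", split="train")

example = dataset[0]
print(example["image"].size, example["label"])  # PIL image and integer class label
```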
## Citing
```bibtex
@article{KataokaIJCV2022,
author={Kataoka, Hirokatsu and Okayasu, Kazushige and Matsumoto, Asato and Yamagata, Eisuke and Yamada, Ryosuke and Inoue, Nakamasa and Nakamura, Akio and Satoh, Yutaka},
title={Pre-training without Natural Images},
journal={International Journal of Computer Vision (IJCV)},
year={2022},
}
@inproceedings{KataokaACCV2020,
author={Kataoka, Hirokatsu and Okayasu, Kazushige and Matsumoto, Asato and Yamagata, Eisuke and Yamada, Ryosuke and Inoue, Nakamasa and Nakamura, Akio and Satoh, Yutaka},
title={Pre-training without Natural Images},
booktitle={Asian Conference on Computer Vision (ACCV)},
year={2020},
}
@misc{kataoka2021pretraining,
title={Pre-training without Natural Images},
author={Hirokatsu Kataoka and Kazushige Okayasu and Asato Matsumoto and Eisuke Yamagata and Ryosuke Yamada and Nakamasa Inoue and Akio Nakamura and Yutaka Satoh},
year={2021},
eprint={2101.08515},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
``` | p1atdev/FractalDB-1k | [
"task_categories:image-classification",
"size_categories:10M<n<100M",
"license:cc-by-4.0",
"arxiv:2101.08515",
"region:us"
] | 2024-02-11T22:51:25+00:00 | {"license": "cc-by-4.0", "size_categories": ["10M<n<100M"], "task_categories": ["image-classification"], "pretty_name": "FractalDB 1k", "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "00000", "1": "00001", "2": "00002", "3": "00003", "4": "00004", "5": "00005", "6": "00006", "7": "00007", "8": 8, "9": 9, "10": "00010", "11": "00011", "12": "00012", "13": "00013", "14": "00014", "15": "00015", "16": "00016", "17": "00017", "18": 18, "19": 19, "20": "00020", "21": "00021", "22": "00022", "23": "00023", "24": "00024", "25": "00025", "26": "00026", "27": "00027", "28": 28, "29": 29, "30": "00030", "31": "00031", "32": "00032", "33": "00033", "34": "00034", "35": "00035", "36": "00036", "37": "00037", "38": 38, "39": 39, "40": "00040", "41": "00041", "42": "00042", "43": "00043", "44": "00044", "45": "00045", "46": "00046", "47": "00047", "48": 48, "49": 49, "50": "00050", "51": "00051", "52": "00052", "53": "00053", "54": "00054", "55": "00055", "56": "00056", "57": "00057", "58": 58, "59": 59, "60": "00060", "61": "00061", "62": "00062", "63": "00063", "64": "00064", "65": "00065", "66": "00066", "67": "00067", "68": 68, "69": 69, "70": "00070", "71": "00071", "72": "00072", "73": "00073", "74": "00074", "75": "00075", "76": "00076", "77": "00077", "78": 78, "79": 79, "80": 80, "81": 81, "82": 82, "83": 83, "84": 84, "85": 85, "86": 86, "87": 87, "88": 88, "89": 89, "90": 90, "91": 91, "92": 92, "93": 93, "94": 94, "95": 95, "96": 96, "97": 97, "98": 98, "99": 99, "100": "00100", "101": "00101", "102": "00102", "103": "00103", "104": "00104", "105": "00105", "106": "00106", "107": "00107", "108": 108, "109": 109, "110": "00110", "111": "00111", "112": "00112", "113": "00113", "114": "00114", "115": "00115", "116": "00116", "117": "00117", "118": 118, "119": 119, "120": "00120", "121": "00121", "122": "00122", "123": "00123", "124": "00124", "125": "00125", "126": "00126", "127": "00127", "128": 128, "129": 129, "130": "00130", "131": "00131", "132": "00132", "133": "00133", "134": "00134", "135": "00135", "136": "00136", "137": "00137", "138": 138, "139": 139, "140": "00140", "141": "00141", "142": "00142", "143": "00143", "144": "00144", "145": "00145", "146": "00146", "147": "00147", "148": 148, "149": 149, "150": "00150", "151": "00151", "152": "00152", "153": "00153", "154": "00154", "155": "00155", "156": "00156", "157": "00157", "158": 158, "159": 159, "160": "00160", "161": "00161", "162": "00162", "163": "00163", "164": "00164", "165": "00165", "166": "00166", "167": "00167", "168": 168, "169": 169, "170": "00170", "171": "00171", "172": "00172", "173": "00173", "174": "00174", "175": "00175", "176": "00176", "177": "00177", "178": 178, "179": 179, "180": 180, "181": 181, "182": 182, "183": 183, "184": 184, "185": 185, "186": 186, "187": 187, "188": 188, "189": 189, "190": 190, "191": 191, "192": 192, "193": 193, "194": 194, "195": 195, "196": 196, "197": 197, "198": 198, "199": 199, "200": "00200", "201": "00201", "202": "00202", "203": "00203", "204": "00204", "205": "00205", "206": "00206", "207": "00207", "208": 208, "209": 209, "210": "00210", "211": "00211", "212": "00212", "213": "00213", "214": "00214", "215": "00215", "216": "00216", "217": "00217", "218": 218, "219": 219, "220": "00220", "221": "00221", "222": "00222", "223": "00223", "224": "00224", "225": "00225", "226": "00226", "227": "00227", "228": 228, "229": 229, "230": "00230", "231": "00231", "232": 
"00232", "233": "00233", "234": "00234", "235": "00235", "236": "00236", "237": "00237", "238": 238, "239": 239, "240": "00240", "241": "00241", "242": "00242", "243": "00243", "244": "00244", "245": "00245", "246": "00246", "247": "00247", "248": 248, "249": 249, "250": "00250", "251": "00251", "252": "00252", "253": "00253", "254": "00254", "255": "00255", "256": "00256", "257": "00257", "258": 258, "259": 259, "260": "00260", "261": "00261", "262": "00262", "263": "00263", "264": "00264", "265": "00265", "266": "00266", "267": "00267", "268": 268, "269": 269, "270": "00270", "271": "00271", "272": "00272", "273": "00273", "274": "00274", "275": "00275", "276": "00276", "277": "00277", "278": 278, "279": 279, "280": 280, "281": 281, "282": 282, "283": 283, "284": 284, "285": 285, "286": 286, "287": 287, "288": 288, "289": 289, "290": 290, "291": 291, "292": 292, "293": 293, "294": 294, "295": 295, "296": 296, "297": 297, "298": 298, "299": 299, "300": "00300", "301": "00301", "302": "00302", "303": "00303", "304": "00304", "305": "00305", "306": "00306", "307": "00307", "308": 308, "309": 309, "310": "00310", "311": "00311", "312": "00312", "313": "00313", "314": "00314", "315": "00315", "316": "00316", "317": "00317", "318": 318, "319": 319, "320": "00320", "321": "00321", "322": "00322", "323": "00323", "324": "00324", "325": "00325", "326": "00326", "327": "00327", "328": 328, "329": 329, "330": "00330", "331": "00331", "332": "00332", "333": "00333", "334": "00334", "335": "00335", "336": "00336", "337": "00337", "338": 338, "339": 339, "340": "00340", "341": "00341", "342": "00342", "343": "00343", "344": "00344", "345": "00345", "346": "00346", "347": "00347", "348": 348, "349": 349, "350": "00350", "351": "00351", "352": "00352", "353": "00353", "354": "00354", "355": "00355", "356": "00356", "357": "00357", "358": 358, "359": 359, "360": "00360", "361": "00361", "362": "00362", "363": "00363", "364": "00364", "365": "00365", "366": "00366", "367": "00367", "368": 368, "369": 369, "370": "00370", "371": "00371", "372": "00372", "373": "00373", "374": "00374", "375": "00375", "376": "00376", "377": "00377", "378": 378, "379": 379, "380": 380, "381": 381, "382": 382, "383": 383, "384": 384, "385": 385, "386": 386, "387": 387, "388": 388, "389": 389, "390": 390, "391": 391, "392": 392, "393": 393, "394": 394, "395": 395, "396": 396, "397": 397, "398": 398, "399": 399, "400": "00400", "401": "00401", "402": "00402", "403": "00403", "404": "00404", "405": "00405", "406": "00406", "407": "00407", "408": 408, "409": 409, "410": "00410", "411": "00411", "412": "00412", "413": "00413", "414": "00414", "415": "00415", "416": "00416", "417": "00417", "418": 418, "419": 419, "420": "00420", "421": "00421", "422": "00422", "423": "00423", "424": "00424", "425": "00425", "426": "00426", "427": "00427", "428": 428, "429": 429, "430": "00430", "431": "00431", "432": "00432", "433": "00433", "434": "00434", "435": "00435", "436": "00436", "437": "00437", "438": 438, "439": 439, "440": "00440", "441": "00441", "442": "00442", "443": "00443", "444": "00444", "445": "00445", "446": "00446", "447": "00447", "448": 448, "449": 449, "450": "00450", "451": "00451", "452": "00452", "453": "00453", "454": "00454", "455": "00455", "456": "00456", "457": "00457", "458": 458, "459": 459, "460": "00460", "461": "00461", "462": "00462", "463": "00463", "464": "00464", "465": "00465", "466": "00466", "467": "00467", "468": 468, "469": 469, "470": "00470", "471": "00471", "472": "00472", "473": "00473", "474": 
"00474", "475": "00475", "476": "00476", "477": "00477", "478": 478, "479": 479, "480": 480, "481": 481, "482": 482, "483": 483, "484": 484, "485": 485, "486": 486, "487": 487, "488": 488, "489": 489, "490": 490, "491": 491, "492": 492, "493": 493, "494": 494, "495": 495, "496": 496, "497": 497, "498": 498, "499": 499, "500": "00500", "501": "00501", "502": "00502", "503": "00503", "504": "00504", "505": "00505", "506": "00506", "507": "00507", "508": 508, "509": 509, "510": "00510", "511": "00511", "512": "00512", "513": "00513", "514": "00514", "515": "00515", "516": "00516", "517": "00517", "518": 518, "519": 519, "520": "00520", "521": "00521", "522": "00522", "523": "00523", "524": "00524", "525": "00525", "526": "00526", "527": "00527", "528": 528, "529": 529, "530": "00530", "531": "00531", "532": "00532", "533": "00533", "534": "00534", "535": "00535", "536": "00536", "537": "00537", "538": 538, "539": 539, "540": "00540", "541": "00541", "542": "00542", "543": "00543", "544": "00544", "545": "00545", "546": "00546", "547": "00547", "548": 548, "549": 549, "550": "00550", "551": "00551", "552": "00552", "553": "00553", "554": "00554", "555": "00555", "556": "00556", "557": "00557", "558": 558, "559": 559, "560": "00560", "561": "00561", "562": "00562", "563": "00563", "564": "00564", "565": "00565", "566": "00566", "567": "00567", "568": 568, "569": 569, "570": "00570", "571": "00571", "572": "00572", "573": "00573", "574": "00574", "575": "00575", "576": "00576", "577": "00577", "578": 578, "579": 579, "580": 580, "581": 581, "582": 582, "583": 583, "584": 584, "585": 585, "586": 586, "587": 587, "588": 588, "589": 589, "590": 590, "591": 591, "592": 592, "593": 593, "594": 594, "595": 595, "596": 596, "597": 597, "598": 598, "599": 599, "600": "00600", "601": "00601", "602": "00602", "603": "00603", "604": "00604", "605": "00605", "606": "00606", "607": "00607", "608": 608, "609": 609, "610": "00610", "611": "00611", "612": "00612", "613": "00613", "614": "00614", "615": "00615", "616": "00616", "617": "00617", "618": 618, "619": 619, "620": "00620", "621": "00621", "622": "00622", "623": "00623", "624": "00624", "625": "00625", "626": "00626", "627": "00627", "628": 628, "629": 629, "630": "00630", "631": "00631", "632": "00632", "633": "00633", "634": "00634", "635": "00635", "636": "00636", "637": "00637", "638": 638, "639": 639, "640": "00640", "641": "00641", "642": "00642", "643": "00643", "644": "00644", "645": "00645", "646": "00646", "647": "00647", "648": 648, "649": 649, "650": "00650", "651": "00651", "652": "00652", "653": "00653", "654": "00654", "655": "00655", "656": "00656", "657": "00657", "658": 658, "659": 659, "660": "00660", "661": "00661", "662": "00662", "663": "00663", "664": "00664", "665": "00665", "666": "00666", "667": "00667", "668": 668, "669": 669, "670": "00670", "671": "00671", "672": "00672", "673": "00673", "674": "00674", "675": "00675", "676": "00676", "677": "00677", "678": 678, "679": 679, "680": 680, "681": 681, "682": 682, "683": 683, "684": 684, "685": 685, "686": 686, "687": 687, "688": 688, "689": 689, "690": 690, "691": 691, "692": 692, "693": 693, "694": 694, "695": 695, "696": 696, "697": 697, "698": 698, "699": 699, "700": "00700", "701": "00701", "702": "00702", "703": "00703", "704": "00704", "705": "00705", "706": "00706", "707": "00707", "708": 708, "709": 709, "710": "00710", "711": "00711", "712": "00712", "713": "00713", "714": "00714", "715": "00715", "716": "00716", "717": "00717", "718": 718, "719": 719, "720": "00720", 
"721": "00721", "722": "00722", "723": "00723", "724": "00724", "725": "00725", "726": "00726", "727": "00727", "728": 728, "729": 729, "730": "00730", "731": "00731", "732": "00732", "733": "00733", "734": "00734", "735": "00735", "736": "00736", "737": "00737", "738": 738, "739": 739, "740": "00740", "741": "00741", "742": "00742", "743": "00743", "744": "00744", "745": "00745", "746": "00746", "747": "00747", "748": 748, "749": 749, "750": "00750", "751": "00751", "752": "00752", "753": "00753", "754": "00754", "755": "00755", "756": "00756", "757": "00757", "758": 758, "759": 759, "760": "00760", "761": "00761", "762": "00762", "763": "00763", "764": "00764", "765": "00765", "766": "00766", "767": "00767", "768": 768, "769": 769, "770": "00770", "771": "00771", "772": "00772", "773": "00773", "774": "00774", "775": "00775", "776": "00776", "777": "00777", "778": 778, "779": 779, "780": 780, "781": 781, "782": 782, "783": 783, "784": 784, "785": 785, "786": 786, "787": 787, "788": 788, "789": 789, "790": 790, "791": 791, "792": 792, "793": 793, "794": 794, "795": 795, "796": 796, "797": 797, "798": 798, "799": 799, "800": 800, "801": 801, "802": 802, "803": 803, "804": 804, "805": 805, "806": 806, "807": 807, "808": 808, "809": 809, "810": 810, "811": 811, "812": 812, "813": 813, "814": 814, "815": 815, "816": 816, "817": 817, "818": 818, "819": 819, "820": 820, "821": 821, "822": 822, "823": 823, "824": 824, "825": 825, "826": 826, "827": 827, "828": 828, "829": 829, "830": 830, "831": 831, "832": 832, "833": 833, "834": 834, "835": 835, "836": 836, "837": 837, "838": 838, "839": 839, "840": 840, "841": 841, "842": 842, "843": 843, "844": 844, "845": 845, "846": 846, "847": 847, "848": 848, "849": 849, "850": 850, "851": 851, "852": 852, "853": 853, "854": 854, "855": 855, "856": 856, "857": 857, "858": 858, "859": 859, "860": 860, "861": 861, "862": 862, "863": 863, "864": 864, "865": 865, "866": 866, "867": 867, "868": 868, "869": 869, "870": 870, "871": 871, "872": 872, "873": 873, "874": 874, "875": 875, "876": 876, "877": 877, "878": 878, "879": 879, "880": 880, "881": 881, "882": 882, "883": 883, "884": 884, "885": 885, "886": 886, "887": 887, "888": 888, "889": 889, "890": 890, "891": 891, "892": 892, "893": 893, "894": 894, "895": 895, "896": 896, "897": 897, "898": 898, "899": 899, "900": 900, "901": 901, "902": 902, "903": 903, "904": 904, "905": 905, "906": 906, "907": 907, "908": 908, "909": 909, "910": 910, "911": 911, "912": 912, "913": 913, "914": 914, "915": 915, "916": 916, "917": 917, "918": 918, "919": 919, "920": 920, "921": 921, "922": 922, "923": 923, "924": 924, "925": 925, "926": 926, "927": 927, "928": 928, "929": 929, "930": 930, "931": 931, "932": 932, "933": 933, "934": 934, "935": 935, "936": 936, "937": 937, "938": 938, "939": 939, "940": 940, "941": 941, "942": 942, "943": 943, "944": 944, "945": 945, "946": 946, "947": 947, "948": 948, "949": 949, "950": 950, "951": 951, "952": 952, "953": 953, "954": 954, "955": 955, "956": 956, "957": 957, "958": 958, "959": 959, "960": 960, "961": 961, "962": 962, "963": 963, "964": 964, "965": 965, "966": 966, "967": 967, "968": 968, "969": 969, "970": 970, "971": 971, "972": 972, "973": 973, "974": 974, "975": 975, "976": 976, "977": 977, "978": 978, "979": 979, "980": 980, "981": 981, "982": 982, "983": 983, "984": 984, "985": 985, "986": 986, "987": 987, "988": 988, "989": 989, "990": 990, "991": 991, "992": 992, "993": 993, "994": 994, "995": 995, "996": 996, "997": 997, "998": 998, "999": 999}}}}], "splits": 
[{"name": "train", "num_bytes": 13905951000, "num_examples": 1000000}], "download_size": 14091259942, "dataset_size": 13905951000}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-12T00:42:17+00:00 | [
"2101.08515"
] | [] | TAGS
#task_categories-image-classification #size_categories-10M<n<100M #license-cc-by-4.0 #arxiv-2101.08515 #region-us
|
# FractalDB 1k
FractalDB 1k dataset from Pre-training without Natural Images.
Original repo | Project page | arXiv
## Citing
| [
"# FractalDB 1k\n\nFractalDB 1k dataset from Pre-training without Natural Images.\n\nOriginal repo | Project page | arXiv",
"## Citing"
] | [
"TAGS\n#task_categories-image-classification #size_categories-10M<n<100M #license-cc-by-4.0 #arxiv-2101.08515 #region-us \n",
"# FractalDB 1k\n\nFractalDB 1k dataset from Pre-training without Natural Images.\n\nOriginal repo | Project page | arXiv",
"## Citing"
] |
1b7677b12055bbd5906f1ab40f24b12a6f788d6c |
# Dataset Card for Evaluation run of jsfs11/MixtureofMerges-MoE-4x7b-v4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jsfs11/MixtureofMerges-MoE-4x7b-v4](https://huggingface.co/jsfs11/MixtureofMerges-MoE-4x7b-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-4x7b-v4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-11T22:50:28.150715](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-4x7b-v4/blob/main/results_2024-02-11T22-50-28.150715.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6518259020068343,
"acc_stderr": 0.03208915192076506,
"acc_norm": 0.6507829553474964,
"acc_norm_stderr": 0.03276672897336205,
"mc1": 0.598531211750306,
"mc1_stderr": 0.017160273901693657,
"mc2": 0.752961276438783,
"mc2_stderr": 0.01417311411412302
},
"harness|arc:challenge|25": {
"acc": 0.7039249146757679,
"acc_stderr": 0.013340916085246256,
"acc_norm": 0.7252559726962458,
"acc_norm_stderr": 0.013044617212771227
},
"harness|hellaswag|10": {
"acc": 0.7131049591714798,
"acc_stderr": 0.004513877465062107,
"acc_norm": 0.8884684325831508,
"acc_norm_stderr": 0.0031414591751392717
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.0154808268653743,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.0154808268653743
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926924,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926924
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834845,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834845
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43910614525139663,
"acc_stderr": 0.016598022120580428,
"acc_norm": 0.43910614525139663,
"acc_norm_stderr": 0.016598022120580428
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263295,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263295
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4745762711864407,
"acc_stderr": 0.012753716929101006,
"acc_norm": 0.4745762711864407,
"acc_norm_stderr": 0.012753716929101006
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.01890101532209309,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.01890101532209309
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.598531211750306,
"mc1_stderr": 0.017160273901693657,
"mc2": 0.752961276438783,
"mc2_stderr": 0.01417311411412302
},
"harness|winogrande|5": {
"acc": 0.8484609313338595,
"acc_stderr": 0.010077698907571776
},
"harness|gsm8k|5": {
"acc": 0.7134192570128886,
"acc_stderr": 0.012454841668337699
}
}
```
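If you only need a quick summary of these numbers, a rough sketch like the following works on the dictionary printed above; here only two of the MMLU entries are copied in as a stand-in for the full file, so the snippet stays self-contained:
```python
# Minimal sketch: average the 5-shot MMLU ("hendrycksTest") accuracies.
# `scores` stands in for the full results dictionary shown above; only two
# entries are included here so the example runs on its own.
scores = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6444444444444445},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7039473684210527},
}

mmlu_accs = [entry["acc"] for task, entry in scores.items()
             if task.startswith("harness|hendrycksTest")]
print(f"{len(mmlu_accs)} MMLU subtasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```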
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-4x7b-v4 | [
"region:us"
] | 2024-02-11T22:52:47+00:00 | {"pretty_name": "Evaluation run of jsfs11/MixtureofMerges-MoE-4x7b-v4", "dataset_summary": "Dataset automatically created during the evaluation run of model [jsfs11/MixtureofMerges-MoE-4x7b-v4](https://huggingface.co/jsfs11/MixtureofMerges-MoE-4x7b-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-4x7b-v4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T22:50:28.150715](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-4x7b-v4/blob/main/results_2024-02-11T22-50-28.150715.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6518259020068343,\n \"acc_stderr\": 0.03208915192076506,\n \"acc_norm\": 0.6507829553474964,\n \"acc_norm_stderr\": 0.03276672897336205,\n \"mc1\": 0.598531211750306,\n \"mc1_stderr\": 0.017160273901693657,\n \"mc2\": 0.752961276438783,\n \"mc2_stderr\": 0.01417311411412302\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7039249146757679,\n \"acc_stderr\": 0.013340916085246256,\n \"acc_norm\": 0.7252559726962458,\n \"acc_norm_stderr\": 0.013044617212771227\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7131049591714798,\n \"acc_stderr\": 0.004513877465062107,\n \"acc_norm\": 0.8884684325831508,\n \"acc_norm_stderr\": 0.0031414591751392717\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.0154808268653743,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.0154808268653743\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926924,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926924\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.822477650063857,\n \"acc_stderr\": 0.013664230995834845,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834845\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n \"acc_stderr\": 0.016598022120580428,\n \"acc_norm\": 0.43910614525139663,\n \"acc_norm_stderr\": 0.016598022120580428\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4745762711864407,\n \"acc_stderr\": 0.012753716929101006,\n \"acc_norm\": 0.4745762711864407,\n \"acc_norm_stderr\": 0.012753716929101006\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209309,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209309\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.598531211750306,\n \"mc1_stderr\": 0.017160273901693657,\n \"mc2\": 0.752961276438783,\n \"mc2_stderr\": 0.01417311411412302\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571776\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7134192570128886,\n \"acc_stderr\": 0.012454841668337699\n }\n}\n```", "repo_url": 
"https://huggingface.co/jsfs11/MixtureofMerges-MoE-4x7b-v4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|arc:challenge|25_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|gsm8k|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hellaswag|10_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T22-50-28.150715.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T22-50-28.150715.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T22-50-28.150715.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T22-50-28.150715.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T22-50-28.150715.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T22-50-28.150715.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["**/details_harness|winogrande|5_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T22-50-28.150715.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_11T22_50_28.150715", "path": ["results_2024-02-11T22-50-28.150715.parquet"]}, {"split": "latest", "path": 
["results_2024-02-11T22-50-28.150715.parquet"]}]}]} | 2024-02-11T22:53:08+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of jsfs11/MixtureofMerges-MoE-4x7b-v4
Dataset automatically created during the evaluation run of model jsfs11/MixtureofMerges-MoE-4x7b-v4 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
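For instance (a minimal sketch; the repository name below is inferred from the leaderboard's usual `details_<org>__<model>` naming convention rather than stated explicitly in this rendering):

```python
from datasets import load_dataset

# Repository name assumed from the Open LLM Leaderboard's
# "details_<org>__<model>" convention for per-model detail datasets.
data = load_dataset(
    "open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-4x7b-v4",
    "harness_winogrande_5",
    split="train",
)
```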
## Latest results
These are the latest results from run 2024-02-11T22:50:28.150715 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of jsfs11/MixtureofMerges-MoE-4x7b-v4\n\n\n\nDataset automatically created during the evaluation run of model jsfs11/MixtureofMerges-MoE-4x7b-v4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T22:50:28.150715(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jsfs11/MixtureofMerges-MoE-4x7b-v4\n\n\n\nDataset automatically created during the evaluation run of model jsfs11/MixtureofMerges-MoE-4x7b-v4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T22:50:28.150715(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
4627efae6bd2ddcebb8acac00d513ffd8e00775c |
# OpenMathInstruct-1
OpenMathInstruct-1 is a math instruction tuning dataset with 1.8M problem-solution pairs
generated using the permissively licensed [Mixtral-8x7B](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) model.
The problems are from the [GSM8K](https://github.com/openai/grade-school-math)
and [MATH](https://github.com/hendrycks/math) training subsets, and the solutions
are synthetically generated by allowing the Mixtral model to use a mix of text reasoning and
code blocks executed by a Python interpreter.
The dataset is split into train and validation subsets that we used in the ablation experiments.
Together, these two subsets cover the full training sets of GSM8K and MATH.
The OpenMathInstruct-1 dataset contains the following fields:
- **question**: original question from either GSM8K or MATH training set.
- **generated_solution**: the synthetically generated solution that uses a mix of text reasoning and code blocks.
- **expected_answer**: the ground-truth answer provided in the original dataset.
- **predicted_answer**: the answer predicted by the Mixtral model in the corresponding solution (extracted from `\boxed{}`).
- **error_message**: `<not_executed>` if code was not used. Otherwise it's empty or contains a Python exception
  from the corresponding code block. A `timeout` string indicates that a code block took longer than 10 seconds to
  execute. In the current dataset version, we always stop generation after any error or a timeout.
- **is_correct**: whether the final answer was considered correct by our grading script.
- **dataset**: gsm8k or math.
- **generation_type**: `without_reference_solution` or `masked_reference_solution`.
We also release the masked solutions used to produce the `generation_type="masked_reference_solution"`
portion of the dataset ([GSM8K-Masked](https://huggingface.co/datasets/nvidia/OpenMath-GSM8K-masked),
[MATH-Masked](https://huggingface.co/datasets/nvidia/OpenMath-MATH-masked)).
See our [paper](https://arxiv.org/abs/2402.10176) to learn more details!
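For quick inspection, here is a minimal sketch of loading the dataset with the `datasets` library and filtering it down to correctly graded, error-free solutions (the split name and the exact filtering criteria below are illustrative assumptions, not part of the official release):

```python
from datasets import load_dataset

# Load the training subset; the card describes "train" and "validation"
# subsets, and those split names are assumed here.
ds = load_dataset("nvidia/OpenMathInstruct-1", split="train")

# Inspect the fields described above on one example.
example = ds[0]
for field in ["question", "generated_solution", "expected_answer",
              "predicted_answer", "error_message", "is_correct",
              "dataset", "generation_type"]:
    print(f"{field}: {str(example[field])[:80]}")

# Keep only solutions whose final answer was graded correct and whose code
# blocks (if any) executed without a Python exception or timeout.
clean = ds.filter(
    lambda x: x["is_correct"] and x["error_message"] in ("", "<not_executed>")
)
print(f"{len(clean)} clean solutions out of {len(ds)}")
```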
## OpenMath models
To demonstrate the quality of this dataset, we release a series of OpenMath models
trained on this data (a combination of train and validation splits to allow comparison with prior work).
<table border="1">
<tr>
<td></td>
<td colspan="2" style="text-align: center;">greedy</td>
<td colspan="2" style="text-align: center;">majority@50</td>
</tr>
<tr>
<td style="text-align: center;">model</td>
<td style="text-align: center;">GSM8K</td>
<td style="text-align: center;">MATH</td>
    <td style="text-align: center;">GSM8K</td>
<td style="text-align: center;">MATH</td>
</tr>
<tr>
<td style="text-align: right;">OpenMath-CodeLlama-7B (<a href="https://huggingface.co/nvidia/OpenMath-CodeLlama-7b-Python">nemo</a> | <a href="https://huggingface.co/nvidia/OpenMath-CodeLlama-7b-Python-hf">HF</a>)</td>
<td style="text-align: center;">75.9</td>
<td style="text-align: center;">43.6</td>
<td style="text-align: center;">84.8</td>
<td style="text-align: center;">55.6</td>
</tr>
<tr>
<td style="text-align: right;">OpenMath-Mistral-7B (<a href="https://huggingface.co/nvidia/OpenMath-Mistral-7B-v0.1">nemo</a> | <a href="https://huggingface.co/nvidia/OpenMath-Mistral-7B-v0.1-hf">HF</a>)</td>
<td style="text-align: center;">80.2</td>
<td style="text-align: center;">44.5</td>
<td style="text-align: center;">86.9</td>
<td style="text-align: center;">57.2</td>
</tr>
<tr>
<td style="text-align: right;">OpenMath-CodeLlama-13B (<a href="https://huggingface.co/nvidia/OpenMath-CodeLlama-13b-Python">nemo</a> | <a href="https://huggingface.co/nvidia/OpenMath-CodeLlama-13b-Python-hf">HF</a>)</td>
<td style="text-align: center;">78.8</td>
<td style="text-align: center;">45.5</td>
<td style="text-align: center;">86.8</td>
<td style="text-align: center;">57.6</td>
</tr>
<tr>
<td style="text-align: right;">OpenMath-CodeLlama-34B (<a href="https://huggingface.co/nvidia/OpenMath-CodeLlama-34b-Python">nemo</a> | <a href="https://huggingface.co/nvidia/OpenMath-CodeLlama-34b-Python-hf">HF</a>)</td>
<td style="text-align: center;">80.7</td>
<td style="text-align: center;">48.3</td>
<td style="text-align: center;">88.0</td>
<td style="text-align: center;">60.2</td>
</tr>
<tr>
<td style="text-align: right;">OpenMath-Llama2-70B (<a href="https://huggingface.co/nvidia/OpenMath-Llama-2-70b">nemo</a> | <a href="https://huggingface.co/nvidia/OpenMath-Llama-2-70b-hf">HF</a>)</td>
<td style="text-align: center;"><b>84.7</b></td>
<td style="text-align: center;">46.3</td>
<td style="text-align: center;">90.1</td>
<td style="text-align: center;">58.3</td>
</tr>
<tr>
<td style="text-align: right;">OpenMath-CodeLlama-70B (<a href="https://huggingface.co/nvidia/OpenMath-CodeLlama-70b-Python">nemo</a> | <a href="https://huggingface.co/nvidia/OpenMath-CodeLlama-70b-Python-hf">HF</a>)</td>
<td style="text-align: center;">84.6</td>
<td style="text-align: center;"><b>50.7</b></td>
<td style="text-align: center;"><b>90.8</b></td>
<td style="text-align: center;"><b>60.4</b></td>
</tr>
</table>
The pipeline we used to produce the data and models is fully open-sourced!
- [Code](https://github.com/Kipok/NeMo-Skills)
- [Models](https://huggingface.co/collections/nvidia/openmath-65c5619de2ba059be0775014)
- [Dataset](https://huggingface.co/datasets/nvidia/OpenMathInstruct-1)
## Reproducing our results
We provide [all instructions](https://github.com/Kipok/NeMo-Skills/blob/main/docs/reproducing-results.md)
to fully reproduce our results, including data generation.
## Generating similar datasets
To generate similar datasets for other tasks or to learn more about our code, read through the docs below.
- [NeMo-Skills Pipeline](https://github.com/Kipok/NeMo-Skills)
- [Generating synthetic data](https://github.com/Kipok/NeMo-Skills/blob/main/docs/synthetic-data-generation.md)
- [Finetuning models](https://github.com/Kipok/NeMo-Skills/blob/main/docs/finetuning.md)
- [Evaluating models](https://github.com/Kipok/NeMo-Skills/blob/main/docs/evaluation.md)
## Citation
If you find our work useful, please consider citing us!
```bibtex
@article{toshniwal2024openmath,
title = {OpenMathInstruct-1: A 1.8 Million Math Instruction Tuning Dataset},
author = {Shubham Toshniwal and Ivan Moshkov and Sean Narenthiran and Daria Gitman and Fei Jia and Igor Gitman},
year = {2024},
  journal = {arXiv preprint arXiv:2402.10176}
}
```
## License
The use of this dataset is governed by the [NVIDIA License](LICENSE) which permits commercial usage. | nvidia/OpenMathInstruct-1 | [
"task_categories:question-answering",
"task_categories:text-generation",
"size_categories:1M<n<10M",
"language:en",
"license:other",
"math",
"code",
"nvidia",
"arxiv:2402.10176",
"region:us"
] | 2024-02-11T23:19:47+00:00 | {"language": ["en"], "license": "other", "size_categories": ["1M<n<10M"], "task_categories": ["question-answering", "text-generation"], "pretty_name": "OpenMathInstruct-1", "license_name": "nvidia-license", "tags": ["math", "code", "nvidia"]} | 2024-02-16T18:42:16+00:00 | [
"2402.10176"
] | [
"en"
] | TAGS
#task_categories-question-answering #task_categories-text-generation #size_categories-1M<n<10M #language-English #license-other #math #code #nvidia #arxiv-2402.10176 #region-us
|
# OpenMathInstruct-1
OpenMathInstruct-1 is a math instruction tuning dataset with 1.8M problem-solution pairs
generated using the permissively licensed Mixtral-8x7B model.
The problems are from the GSM8K
and MATH training subsets, and the solutions
are synthetically generated by allowing the Mixtral model to use a mix of text reasoning and
code blocks executed by a Python interpreter.
The dataset is split into train and validation subsets that we used in the ablation experiments.
Together, these two subsets cover the full training sets of GSM8K and MATH.
The OpenMathInstruct-1 dataset contains the following fields:
- question: original question from either GSM8K or MATH training set.
- generated_solution: the synthetically generated solution that uses a mix of text reasoning and code blocks.
- expected_answer: the ground-truth answer provided in the original dataset.
- predicted_answer: the answer predicted by the Mixtral model in the corresponding solution (extracted from '\boxed{}').
- error_message: '<not_executed>' if code was not used. Otherwise it's empty or contains a Python exception
 from the corresponding code block. A 'timeout' string indicates that a code block took longer than 10 seconds to
 execute. In the current dataset version, we always stop generation after any error or a timeout.
- is_correct: whether the final answer was considered correct by our grading script.
- dataset: gsm8k or math.
- generation_type: 'without_reference_solution' or 'masked_reference_solution'.
We also release the masked solutions used to produce the 'generation_type="masked_reference_solution"'
portion of the dataset (GSM8K-Masked,
MATH-Masked).
See our paper to learn more details!
## OpenMath models
To demonstrate the quality of this dataset, we release a series of OpenMath models
trained on this data (a combination of train and validation splits to allow comparison with prior work).
<table border="1">
<tr>
<td></td>
<td colspan="2" style="text-align: center;">greedy</td>
<td colspan="2" style="text-align: center;">majority@50</td>
</tr>
<tr>
<td style="text-align: center;">model</td>
<td style="text-align: center;">GSM8K</td>
<td style="text-align: center;">MATH</td>
    <td style="text-align: center;">GSM8K</td>
<td style="text-align: center;">MATH</td>
</tr>
<tr>
<td style="text-align: right;">OpenMath-CodeLlama-7B (<a href="URL | <a href="URL
<td style="text-align: center;">75.9</td>
<td style="text-align: center;">43.6</td>
<td style="text-align: center;">84.8</td>
<td style="text-align: center;">55.6</td>
</tr>
<tr>
<td style="text-align: right;">OpenMath-Mistral-7B (<a href="URL | <a href="URL
<td style="text-align: center;">80.2</td>
<td style="text-align: center;">44.5</td>
<td style="text-align: center;">86.9</td>
<td style="text-align: center;">57.2</td>
</tr>
<tr>
<td style="text-align: right;">OpenMath-CodeLlama-13B (<a href="URL | <a href="URL
<td style="text-align: center;">78.8</td>
<td style="text-align: center;">45.5</td>
<td style="text-align: center;">86.8</td>
<td style="text-align: center;">57.6</td>
</tr>
<tr>
<td style="text-align: right;">OpenMath-CodeLlama-34B (<a href="URL | <a href="URL
<td style="text-align: center;">80.7</td>
<td style="text-align: center;">48.3</td>
<td style="text-align: center;">88.0</td>
<td style="text-align: center;">60.2</td>
</tr>
<tr>
<td style="text-align: right;">OpenMath-Llama2-70B (<a href="URL | <a href="URL
<td style="text-align: center;"><b>84.7</b></td>
<td style="text-align: center;">46.3</td>
<td style="text-align: center;">90.1</td>
<td style="text-align: center;">58.3</td>
</tr>
<tr>
<td style="text-align: right;">OpenMath-CodeLlama-70B (<a href="URL | <a href="URL
<td style="text-align: center;">84.6</td>
<td style="text-align: center;"><b>50.7</b></td>
<td style="text-align: center;"><b>90.8</b></td>
<td style="text-align: center;"><b>60.4</b></td>
</tr>
</table>
The pipeline we used to produce the data and models is fully open-sourced!
- Code
- Models
- Dataset
## Reproducing our results
We provide all instructions
to fully reproduce our results, including data generation.
## Generating similar datasets
To generate similar datasets for other tasks or to learn more about our code, read through the docs below.
- NeMo-Skills Pipeline
- Generating synthetic data
- Finetuning models
- Evaluating models
If you find our work useful, please consider citing us!
## License
The use of this dataset is governed by the NVIDIA License which permits commercial usage. | [
"# OpenMathInstruct-1\n\nOpenMathInstruct-1 is a math instruction tuning dataset with 1.8M problem-solution pairs\ngenerated using permissively licensed Mixtral-8x7B model.\n\nThe problems are from GSM8K\nand MATH training subsets and the solutions\nare synthetically generated by allowing Mixtral model to use a mix of text reasoning and\ncode blocks executed by Python interpreter.\n\nThe dataset is split into train and validation subsets that we used in the ablations experiments.\nThese two subsets combined together cover the full training set of GSM8K and MATH.\n\nOpenMathInstruct-1 dataset contains of the following fields:\n\n- question: original question from either GSM8K or MATH training set.\n- generated_solution: the synthetically generated solution that uses a mix of text reasoning and code blocks.\n- expected_answer: the ground-truth answer provided in the original dataset.\n- predicted_answer: the answer predicted by Mixtral model in the corresponding solution (extracted from '\\boxed{}').\n- error_message: '<not_executed>' if code was not used. Otherwise it's empty or contains a Python exception\n from the corresponding code block. A 'timeout' string indicates that code block took longer than 10 seconds to\n execute. In the current dataset version we always stop generation after any error or a timeout.\n- is_correct: whether the final answer was considered correct by our grading script.\n- dataset: gsm8k or math.\n- generation_type: 'without_reference_solution' or 'masked_reference_solution'.\n\nWe also release the masked solutions used to produce 'generation_type=\"masked_reference_solution\"'\nportion of the dataset (GSM8K-Masked,\nMATH-Masked).\n\nSee our paper to learn more details!",
"## OpenMath models\n\nTo demonstrate the quality of this dataset, we release a series of OpenMath models\ntrained on this data (a combination of train and validation splits to allow comparison with prior work).\n\n<table border=\"1\">\n <tr>\n <td></td>\n <td colspan=\"2\" style=\"text-align: center;\">greedy</td>\n <td colspan=\"2\" style=\"text-align: center;\">majority@50</td>\n </tr>\n <tr>\n <td style=\"text-align: center;\">model</td>\n <td style=\"text-align: center;\">GSM8K</td>\n <td style=\"text-align: center;\">MATH</td>\n <td style=\"text-align: center;\">GMS8K</td>\n <td style=\"text-align: center;\">MATH</td>\n </tr>\n <tr>\n <td style=\"text-align: right;\">OpenMath-CodeLlama-7B (<a href=\"URL | <a href=\"URL\n <td style=\"text-align: center;\">75.9</td>\n <td style=\"text-align: center;\">43.6</td>\n <td style=\"text-align: center;\">84.8</td>\n <td style=\"text-align: center;\">55.6</td>\n </tr>\n <tr>\n <td style=\"text-align: right;\">OpenMath-Mistral-7B (<a href=\"URL | <a href=\"URL\n <td style=\"text-align: center;\">80.2</td>\n <td style=\"text-align: center;\">44.5</td>\n <td style=\"text-align: center;\">86.9</td>\n <td style=\"text-align: center;\">57.2</td>\n </tr>\n <tr>\n <td style=\"text-align: right;\">OpenMath-CodeLlama-13B (<a href=\"URL | <a href=\"URL\n <td style=\"text-align: center;\">78.8</td>\n <td style=\"text-align: center;\">45.5</td>\n <td style=\"text-align: center;\">86.8</td>\n <td style=\"text-align: center;\">57.6</td>\n </tr>\n <tr>\n <td style=\"text-align: right;\">OpenMath-CodeLlama-34B (<a href=\"URL | <a href=\"URL\n <td style=\"text-align: center;\">80.7</td>\n <td style=\"text-align: center;\">48.3</td>\n <td style=\"text-align: center;\">88.0</td>\n <td style=\"text-align: center;\">60.2</td>\n </tr>\n <tr>\n <td style=\"text-align: right;\">OpenMath-Llama2-70B (<a href=\"URL | <a href=\"URL\n <td style=\"text-align: center;\"><b>84.7</b></td>\n <td style=\"text-align: center;\">46.3</td>\n <td style=\"text-align: center;\">90.1</td>\n <td style=\"text-align: center;\">58.3</td>\n </tr>\n <tr>\n <td style=\"text-align: right;\">OpenMath-CodeLlama-70B (<a href=\"URL | <a href=\"URL\n <td style=\"text-align: center;\">84.6</td>\n <td style=\"text-align: center;\"><b>50.7</b></td>\n <td style=\"text-align: center;\"><b>90.8</b></td>\n <td style=\"text-align: center;\"><b>60.4</b></td>\n </tr>\n</table>\n\nThe pipeline we used to produce the data and models is fully open-sourced!\n\n- Code\n- Models\n- Dataset",
"## Reproducing our results\n\nWe provide all instructions\nto fully reproduce our results, including data generation.",
"## Generating similar datasets\n\nTo generate similar datasets for other tasks or to learn more about our code, read through the docs below.\n\n- NeMo-Skills Pipeline\n - Generating synthetic data\n - Finetuning models\n - Evaluating models\n\nIf you find our work useful, please consider citing us!",
"## License\n\nThe use of this dataset is governed by the NVIDIA License which permits commercial usage."
] | [
"TAGS\n#task_categories-question-answering #task_categories-text-generation #size_categories-1M<n<10M #language-English #license-other #math #code #nvidia #arxiv-2402.10176 #region-us \n",
"# OpenMathInstruct-1\n\nOpenMathInstruct-1 is a math instruction tuning dataset with 1.8M problem-solution pairs\ngenerated using permissively licensed Mixtral-8x7B model.\n\nThe problems are from GSM8K\nand MATH training subsets and the solutions\nare synthetically generated by allowing Mixtral model to use a mix of text reasoning and\ncode blocks executed by Python interpreter.\n\nThe dataset is split into train and validation subsets that we used in the ablations experiments.\nThese two subsets combined together cover the full training set of GSM8K and MATH.\n\nOpenMathInstruct-1 dataset contains of the following fields:\n\n- question: original question from either GSM8K or MATH training set.\n- generated_solution: the synthetically generated solution that uses a mix of text reasoning and code blocks.\n- expected_answer: the ground-truth answer provided in the original dataset.\n- predicted_answer: the answer predicted by Mixtral model in the corresponding solution (extracted from '\\boxed{}').\n- error_message: '<not_executed>' if code was not used. Otherwise it's empty or contains a Python exception\n from the corresponding code block. A 'timeout' string indicates that code block took longer than 10 seconds to\n execute. In the current dataset version we always stop generation after any error or a timeout.\n- is_correct: whether the final answer was considered correct by our grading script.\n- dataset: gsm8k or math.\n- generation_type: 'without_reference_solution' or 'masked_reference_solution'.\n\nWe also release the masked solutions used to produce 'generation_type=\"masked_reference_solution\"'\nportion of the dataset (GSM8K-Masked,\nMATH-Masked).\n\nSee our paper to learn more details!",
"## OpenMath models\n\nTo demonstrate the quality of this dataset, we release a series of OpenMath models\ntrained on this data (a combination of train and validation splits to allow comparison with prior work).\n\n<table border=\"1\">\n <tr>\n <td></td>\n <td colspan=\"2\" style=\"text-align: center;\">greedy</td>\n <td colspan=\"2\" style=\"text-align: center;\">majority@50</td>\n </tr>\n <tr>\n <td style=\"text-align: center;\">model</td>\n <td style=\"text-align: center;\">GSM8K</td>\n <td style=\"text-align: center;\">MATH</td>\n <td style=\"text-align: center;\">GMS8K</td>\n <td style=\"text-align: center;\">MATH</td>\n </tr>\n <tr>\n <td style=\"text-align: right;\">OpenMath-CodeLlama-7B (<a href=\"URL | <a href=\"URL\n <td style=\"text-align: center;\">75.9</td>\n <td style=\"text-align: center;\">43.6</td>\n <td style=\"text-align: center;\">84.8</td>\n <td style=\"text-align: center;\">55.6</td>\n </tr>\n <tr>\n <td style=\"text-align: right;\">OpenMath-Mistral-7B (<a href=\"URL | <a href=\"URL\n <td style=\"text-align: center;\">80.2</td>\n <td style=\"text-align: center;\">44.5</td>\n <td style=\"text-align: center;\">86.9</td>\n <td style=\"text-align: center;\">57.2</td>\n </tr>\n <tr>\n <td style=\"text-align: right;\">OpenMath-CodeLlama-13B (<a href=\"URL | <a href=\"URL\n <td style=\"text-align: center;\">78.8</td>\n <td style=\"text-align: center;\">45.5</td>\n <td style=\"text-align: center;\">86.8</td>\n <td style=\"text-align: center;\">57.6</td>\n </tr>\n <tr>\n <td style=\"text-align: right;\">OpenMath-CodeLlama-34B (<a href=\"URL | <a href=\"URL\n <td style=\"text-align: center;\">80.7</td>\n <td style=\"text-align: center;\">48.3</td>\n <td style=\"text-align: center;\">88.0</td>\n <td style=\"text-align: center;\">60.2</td>\n </tr>\n <tr>\n <td style=\"text-align: right;\">OpenMath-Llama2-70B (<a href=\"URL | <a href=\"URL\n <td style=\"text-align: center;\"><b>84.7</b></td>\n <td style=\"text-align: center;\">46.3</td>\n <td style=\"text-align: center;\">90.1</td>\n <td style=\"text-align: center;\">58.3</td>\n </tr>\n <tr>\n <td style=\"text-align: right;\">OpenMath-CodeLlama-70B (<a href=\"URL | <a href=\"URL\n <td style=\"text-align: center;\">84.6</td>\n <td style=\"text-align: center;\"><b>50.7</b></td>\n <td style=\"text-align: center;\"><b>90.8</b></td>\n <td style=\"text-align: center;\"><b>60.4</b></td>\n </tr>\n</table>\n\nThe pipeline we used to produce the data and models is fully open-sourced!\n\n- Code\n- Models\n- Dataset",
"## Reproducing our results\n\nWe provide all instructions\nto fully reproduce our results, including data generation.",
"## Generating similar datasets\n\nTo generate similar datasets for other tasks or to learn more about our code, read through the docs below.\n\n- NeMo-Skills Pipeline\n - Generating synthetic data\n - Finetuning models\n - Evaluating models\n\nIf you find our work useful, please consider citing us!",
"## License\n\nThe use of this dataset is governed by the NVIDIA License which permits commercial usage."
] |
f6c0bb130511e0b2086884e87f6e1f8750823637 |
# Dataset Card for Evaluation run of chanwit/flux-base-optimized
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chanwit/flux-base-optimized](https://huggingface.co/chanwit/flux-base-optimized) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chanwit__flux-base-optimized",
"harness_winogrande_5",
split="train")
```
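To pull the aggregated metrics rather than a single task, you can load the "results" configuration instead; a sketch (the "latest" split is assumed to follow the run-timestamp convention described above):

```python
from datasets import load_dataset

# The aggregated metrics live in the "results" configuration; the "latest"
# split is assumed to point at the most recent run, as described above.
results = load_dataset(
    "open-llm-leaderboard/details_chanwit__flux-base-optimized",
    "results",
    split="latest",
)
print(results[0])
```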
## Latest results
These are the [latest results from run 2024-02-11T23:31:14.212913](https://huggingface.co/datasets/open-llm-leaderboard/details_chanwit__flux-base-optimized/blob/main/results_2024-02-11T23-31-14.212913.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5981649622618641,
"acc_stderr": 0.0333006317784589,
"acc_norm": 0.6020819933365092,
"acc_norm_stderr": 0.033975467298082776,
"mc1": 0.34516523867809057,
"mc1_stderr": 0.01664310331927494,
"mc2": 0.5001790307121097,
"mc2_stderr": 0.015267929934854846
},
"harness|arc:challenge|25": {
"acc": 0.60580204778157,
"acc_stderr": 0.01428052266746732,
"acc_norm": 0.6544368600682594,
"acc_norm_stderr": 0.013896938461145677
},
"harness|hellaswag|10": {
"acc": 0.6072495518820952,
"acc_stderr": 0.004873640184773443,
"acc_norm": 0.8173670583549094,
"acc_norm_stderr": 0.0038557568514415463
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138215,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138215
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6161290322580645,
"acc_stderr": 0.027666182075539635,
"acc_norm": 0.6161290322580645,
"acc_norm_stderr": 0.027666182075539635
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124495,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124495
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5948717948717949,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.5948717948717949,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871927,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871927
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.017090573804217902,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.017090573804217902
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489294,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489294
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.014248873549217583,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.014248873549217583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3039106145251397,
"acc_stderr": 0.015382845587584506,
"acc_norm": 0.3039106145251397,
"acc_norm_stderr": 0.015382845587584506
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.02671611838015685,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.02671611838015685
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.027264297599804012,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.027264297599804012
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44002607561929596,
"acc_stderr": 0.012678037478574513,
"acc_norm": 0.44002607561929596,
"acc_norm_stderr": 0.012678037478574513
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.029896163033125474,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.029896163033125474
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854125,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854125
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249772,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249772
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6915422885572139,
"acc_stderr": 0.03265819588512699,
"acc_norm": 0.6915422885572139,
"acc_norm_stderr": 0.03265819588512699
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34516523867809057,
"mc1_stderr": 0.01664310331927494,
"mc2": 0.5001790307121097,
"mc2_stderr": 0.015267929934854846
},
"harness|winogrande|5": {
"acc": 0.7774269928966061,
"acc_stderr": 0.01169093380971267
},
"harness|gsm8k|5": {
"acc": 0.44655041698256254,
"acc_stderr": 0.01369356654974314
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_chanwit__flux-base-optimized | [
"region:us"
] | 2024-02-11T23:27:41+00:00 | {"pretty_name": "Evaluation run of chanwit/flux-base-optimized", "dataset_summary": "Dataset automatically created during the evaluation run of model [chanwit/flux-base-optimized](https://huggingface.co/chanwit/flux-base-optimized) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chanwit__flux-base-optimized\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-11T23:31:14.212913](https://huggingface.co/datasets/open-llm-leaderboard/details_chanwit__flux-base-optimized/blob/main/results_2024-02-11T23-31-14.212913.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5981649622618641,\n \"acc_stderr\": 0.0333006317784589,\n \"acc_norm\": 0.6020819933365092,\n \"acc_norm_stderr\": 0.033975467298082776,\n \"mc1\": 0.34516523867809057,\n \"mc1_stderr\": 0.01664310331927494,\n \"mc2\": 0.5001790307121097,\n \"mc2_stderr\": 0.015267929934854846\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.60580204778157,\n \"acc_stderr\": 0.01428052266746732,\n \"acc_norm\": 0.6544368600682594,\n \"acc_norm_stderr\": 0.013896938461145677\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6072495518820952,\n \"acc_stderr\": 0.004873640184773443,\n \"acc_norm\": 0.8173670583549094,\n \"acc_norm_stderr\": 0.0038557568514415463\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 
0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.041618085035015295,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.041618085035015295\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138215,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138215\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6161290322580645,\n \"acc_stderr\": 0.027666182075539635,\n \"acc_norm\": 0.6161290322580645,\n \"acc_norm_stderr\": 0.027666182075539635\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124495,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124495\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5948717948717949,\n \"acc_stderr\": 0.024890471769938145,\n \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.024890471769938145\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871927,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871927\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217902,\n \"acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217902\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.033247089118091176,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.033247089118091176\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.023636873317489294,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.023636873317489294\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8020434227330779,\n \"acc_stderr\": 0.014248873549217583,\n 
\"acc_norm\": 0.8020434227330779,\n \"acc_norm_stderr\": 0.014248873549217583\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3039106145251397,\n \"acc_stderr\": 0.015382845587584506,\n \"acc_norm\": 0.3039106145251397,\n \"acc_norm_stderr\": 0.015382845587584506\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.02671611838015685,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.02671611838015685\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n \"acc_stderr\": 0.027264297599804012,\n \"acc_norm\": 0.639871382636656,\n \"acc_norm_stderr\": 0.027264297599804012\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44002607561929596,\n \"acc_stderr\": 0.012678037478574513,\n \"acc_norm\": 0.44002607561929596,\n \"acc_norm_stderr\": 0.012678037478574513\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.029896163033125474,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.029896163033125474\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854125,\n \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854125\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249772,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249772\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n \"acc_stderr\": 0.03265819588512699,\n \"acc_norm\": 0.6915422885572139,\n \"acc_norm_stderr\": 0.03265819588512699\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34516523867809057,\n \"mc1_stderr\": 0.01664310331927494,\n \"mc2\": 0.5001790307121097,\n \"mc2_stderr\": 0.015267929934854846\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.01169093380971267\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.44655041698256254,\n \"acc_stderr\": 0.01369356654974314\n }\n}\n```", "repo_url": 
"https://huggingface.co/chanwit/flux-base-optimized", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|arc:challenge|25_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|arc:challenge|25_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|gsm8k|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|gsm8k|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hellaswag|10_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hellaswag|10_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T23-25-22.204907.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T23-25-22.204907.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T23-31-14.212913.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T23-31-14.212913.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T23-31-14.212913.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T23-31-14.212913.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T23-25-22.204907.parquet"]}, 
{"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["**/details_harness|winogrande|5_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": ["**/details_harness|winogrande|5_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-11T23-31-14.212913.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_11T23_25_22.204907", "path": ["results_2024-02-11T23-25-22.204907.parquet"]}, {"split": "2024_02_11T23_31_14.212913", "path": 
["results_2024-02-11T23-31-14.212913.parquet"]}, {"split": "latest", "path": ["results_2024-02-11T23-31-14.212913.parquet"]}]}]} | 2024-02-11T23:33:50+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of chanwit/flux-base-optimized
Dataset automatically created during the evaluation run of model chanwit/flux-base-optimized on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
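A minimal sketch is given below; the repository id is an assumption based on the usual `open-llm-leaderboard/details_<org>__<model>` naming convention for these evaluation detail datasets and may need adjusting:

```python
from datasets import load_dataset

# Assumed repository id, following the standard open-llm-leaderboard
# "details_<org>__<model>" naming convention for evaluation detail datasets.
data = load_dataset(
    "open-llm-leaderboard/details_chanwit__flux-base-optimized",
    "harness_winogrande_5",  # one of the 63 task configurations
    split="train",           # "train" always points to the latest results
)
```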
## Latest results
These are the latest results from run 2024-02-11T23:31:14.212913 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of chanwit/flux-base-optimized\n\n\n\nDataset automatically created during the evaluation run of model chanwit/flux-base-optimized on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T23:31:14.212913(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of chanwit/flux-base-optimized\n\n\n\nDataset automatically created during the evaluation run of model chanwit/flux-base-optimized on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-11T23:31:14.212913(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
3217e3a620818baabc38e4fb6f532c93e25bafb7 | # Dataset Card for "Calc-ape210k_selftrain_experiment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | anonym-repos/Calc-ape210k_selftrain_experiment | [
"region:us"
] | 2024-02-11T23:29:03+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "question_chinese", "dtype": "string"}, {"name": "chain", "dtype": "string"}, {"name": "result", "dtype": "string"}, {"name": "result_float", "dtype": "float64"}, {"name": "equation", "dtype": "string"}, {"name": "model_checkpoint", "dtype": "string"}, {"name": "correct_1", "dtype": "string"}, {"name": "correct_2", "dtype": "string"}, {"name": "incorrect_1", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 34677213, "num_examples": 24097}], "download_size": 14617495, "dataset_size": 34677213}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-11T23:40:13+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Calc-ape210k_selftrain_experiment"
More Information needed | [
"# Dataset Card for \"Calc-ape210k_selftrain_experiment\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Calc-ape210k_selftrain_experiment\"\n\nMore Information needed"
] |
6fd6ff42c36ab6c2da298348087d59a45d7a3df6 | # Dataset Card for "Calc-ape210k_selftrain_experiment_melted"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | anonym-repos/Calc-ape210k_selftrain_experiment_balanced | [
"region:us"
] | 2024-02-11T23:29:32+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "question_chinese", "dtype": "string"}, {"name": "chain", "dtype": "string"}, {"name": "result", "dtype": "string"}, {"name": "result_float", "dtype": "float64"}, {"name": "equation", "dtype": "string"}, {"name": "model_checkpoint", "dtype": "string"}, {"name": "correct", "dtype": "string"}, {"name": "incorrect_1", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 55447279, "num_examples": 48194}], "download_size": 23378146, "dataset_size": 55447279}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-11T23:41:35+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Calc-ape210k_selftrain_experiment_melted"
More Information needed | [
"# Dataset Card for \"Calc-ape210k_selftrain_experiment_melted\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Calc-ape210k_selftrain_experiment_melted\"\n\nMore Information needed"
] |
16c9b9780edc068d1948ff58d4ce357c2709b54b | # Dataset Card for "Calc-ape210k_selftrain_experiment_prompted"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | anonym-repos/Calc-ape210k_selftrain_experiment_negative | [
"region:us"
] | 2024-02-11T23:30:03+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "question_chinese", "dtype": "string"}, {"name": "chain", "dtype": "string"}, {"name": "result", "dtype": "string"}, {"name": "result_float", "dtype": "float64"}, {"name": "equation", "dtype": "string"}, {"name": "model_checkpoint", "dtype": "string"}, {"name": "prediction", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 43185012, "num_examples": 48194}], "download_size": 12438720, "dataset_size": 43185012}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-11T23:40:48+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Calc-ape210k_selftrain_experiment_prompted"
More Information needed | [
"# Dataset Card for \"Calc-ape210k_selftrain_experiment_prompted\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Calc-ape210k_selftrain_experiment_prompted\"\n\nMore Information needed"
] |
004f38fc8b6b33d884bd5bc724d6ba90851986fc | # Persian OpenAssistant-Guanaco Dataset
## About ZharfaTech
ZharfaTech is at the forefront of developing advanced Large Language Models (LLMs) specifically for the Persian language, aiming to empower over 100 million Persian speakers worldwide. Our objective is to bridge the digital gap in services leveraging LLMs, such as content generation, translation, and customer relationship systems, by providing tailored open-source and closed-source LLM solutions. We focus on democratizing access to LLM technology for Persian language users, developers, and businesses, fostering innovation and collaboration within the community.
## Dataset Overview
This dataset is the Persian translation of the "openassistant-guanaco" dataset, originally found at [https://huggingface.co/datasets/timdettmers/openassistant-guanaco](https://huggingface.co/datasets/timdettmers/openassistant-guanaco). It has been translated to cater to the nuances of the Persian language, utilizing a high-performance local translation model. The translation process was completed in 12 hours using a single Nvidia GPU, ensuring a blend of speed and accuracy.
### Key Features:
- **Language:** Persian
- **Source:** Translated from "openassistant-guanaco"
- **Translation Method:** Local translation model
- **Processing Time:** 12 hours on a single Nvidia GPU
## Objective and Scope
ZharfaTech is dedicated to enhancing the capabilities and reach of LLM technologies for the Persian language through:
- Development of fine-tuned open-source models for the Persian language.
- Creation of specialized datasets to support extensive training and refinement.
- Advanced closed-source model development for specialized solutions.
Our dual approach of fostering community collaboration and providing high-value, specialized solutions aims to advance LLM technologies for the Persian language, making significant strides towards inclusivity and accessibility in digital services.
## How to Use This Dataset
This dataset is intended for researchers, developers, and businesses interested in developing Persian language capabilities in their LLMs. It can be used to train models for a variety of applications, including but not limited to natural language understanding, content generation, and customer interaction systems.
To access and utilize this dataset, please follow the instructions below:
1. Visit our dataset page on Hugging Face: https://huggingface.co/datasets/ZharfaTech/openassistant-guanaco-persian-instruct-fa
2. Review the dataset documentation for details on structure and content.
3. Download the dataset using the provided Hugging Face commands or API (a minimal loading sketch is shown below).
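
As an illustration of step 3, here is a minimal sketch using the `datasets` library. The split name `train` is an assumption and may differ, so check the dataset page for the actual split layout:

```python
from datasets import load_dataset

# Minimal sketch (assumptions): the repository id is taken from step 1 above,
# and a default "train" split is assumed to exist; adjust to the splits
# actually listed on the dataset page.
dataset = load_dataset("ZharfaTech/openassistant-guanaco-persian-instruct-fa", split="train")

# Inspect the first record to see the instruction/response structure.
print(dataset[0])
```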
## Contributing
We welcome contributions from the community to improve and expand this dataset.
## Acknowledgments
We extend our gratitude to the creators of the "openassistant-guanaco" dataset for providing the foundation for this translation. Our thanks also go to the dedicated team members who utilized their expertise to ensure the accuracy and relevance of this Persian translation.
## License
This dataset is available under an apache-2.0 license, aligning with the original "openassistant-guanaco" dataset's licensing terms. For more information, please review the license details on our dataset page.
## Contact Us
For more information about ZharfaTech and our projects, or if you have any questions regarding this dataset, please contact us at https://zharfa.tech.
---
ZharfaTech: Empowering Persian language speakers with advanced LLM technology. | ZharfaTech/openassistant-guanaco-persian-instruct-fa | [
"task_categories:text2text-generation",
"size_categories:1K<n<10K",
"language:fa",
"license:apache-2.0",
"region:us"
] | 2024-02-11T23:46:44+00:00 | {"language": ["fa"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["text2text-generation"], "pretty_name": "persian-guanaco"} | 2024-02-12T00:07:28+00:00 | [] | [
"fa"
] | TAGS
#task_categories-text2text-generation #size_categories-1K<n<10K #language-Persian #license-apache-2.0 #region-us
| # Persian OpenAssistant-Guanaco Dataset
## About ZharfaTech
ZharfaTech is at the forefront of developing advanced Large Language Models (LLMs) specifically for the Persian language, aiming to empower over 100 million Persian speakers worldwide. Our objective is to bridge the digital gap in services leveraging LLMs, such as content generation, translation, and customer relationship systems, by providing tailored open-source and closed-source LLM solutions. We focus on democratizing access to LLM technology for Persian language users, developers, and businesses, fostering innovation and collaboration within the community.
## Dataset Overview
This dataset is the Persian translation of the "openassistant-guanaco" dataset, originally found at URL It has been translated to cater to the nuances of the Persian language, utilizing a high-performance local translation model. The translation process was completed in 12 hours using a single Nvidia GPU, ensuring a blend of speed and accuracy.
### Key Features:
- Language: Persian
- Source: Translated from "openassistant-guanaco"
- Translation Method: Local translation model
- Processing Time: 12 hours on a single Nvidia GPU
## Objective and Scope
ZharfaTech is dedicated to enhancing the capabilities and reach of LLM technologies for the Persian language through:
- Development of fine-tuned open-source models for the Persian language.
- Creation of specialized datasets to support extensive training and refinement.
- Advanced closed-source model development for specialized solutions.
Our dual approach of fostering community collaboration and providing high-value, specialized solutions aims to advance LLM technologies for the Persian language, making significant strides towards inclusivity and accessibility in digital services.
## How to Use This Dataset
This dataset is intended for researchers, developers, and businesses interested in developing Persian language capabilities in their LLMs. It can be used to train models for a variety of applications, including but not limited to natural language understanding, content generation, and customer interaction systems.
To access and utilize this dataset, please follow the instructions below:
1. Visit our dataset page on Hugging Face: [URL
2. Review the dataset documentation for details on structure and content.
3. Download the dataset using the provided Hugging Face commands or API.
## Contributing
We welcome contributions from the community to improve and expand this dataset.
## Acknowledgments
We extend our gratitude to the creators of the "openassistant-guanaco" dataset for providing the foundation for this translation. Our thanks also go to the dedicated team members who utilized their expertise to ensure the accuracy and relevance of this Persian translation.
## License
This dataset is available under an apache-2.0 license, aligning with the original "openassistant-guanaco" dataset's licensing terms. For more information, please review the license details on our dataset page.
## Contact Us
For more information about ZharfaTech and our projects, or if you have any questions regarding this dataset, please contact us at [URL].
---
ZharfaTech: Empowering Persian language speakers with advanced LLM technology. | [
"# Persian OpenAssistant-Guanaco Dataset",
"## About ZharfaTech\nZharfaTech is at the forefront of developing advanced Language Learning Models (LLMs) specifically for the Persian language, aiming to empower over 100 million Persian speakers worldwide. Our objective is to bridge the digital gap in services leveraging LLMs, such as content generation, translation, and customer relationship systems, by providing tailored open-source and closed-source LLM solutions. We focus on democratizing access to LLM technology for Persian language users, developers, and businesses, fostering innovation and collaboration within the community.",
"## Dataset Overview\nThis dataset is the Persian translation of the \"openassistant-guanaco\" dataset, originally found at URL It has been translated to cater to the nuances of the Persian language, utilizing a high-performance local translation model. The translation process was completed in 12 hours using a single Nvidia GPU, ensuring a blend of speed and accuracy.",
"### Key Features:\n- Language: Persian\n- Source: Translated from \"openassistant-guanaco\"\n- Translation Method: Local transitional model\n- Processing Time: 12 hours on a single Nvidia GPU",
"## Objective and Scope\nZharfaTech is dedicated to enhancing the capabilities and reach of LLM technologies for the Persian language through:\n- Development of fine-tuned open-source models for the Persian language.\n- Creation of specialized datasets to support extensive training and refinement.\n- Advanced closed-source model development for specialized solutions.\n\nOur dual approach of fostering community collaboration and providing high-value, specialized solutions aims to advance LLM technologies for the Persian language, making significant strides towards inclusivity and accessibility in digital services.",
"## How to Use This Dataset\nThis dataset is intended for researchers, developers, and businesses interested in developing Persian language capabilities in their LLMs. It can be used to train models for a variety of applications, including but not limited to natural language understanding, content generation, and customer interaction systems.\n\nTo access and utilize this dataset, please follow the instructions below:\n1. Visit our dataset page on Hugging Face: [URL\n2. Review the dataset documentation for details on structure and content.\n3. Download the dataset using the provided Hugging Face commands or API.",
"## Contributing\nWe welcome contributions from the community to improve and expand this dataset.",
"## Acknowledgments\nWe extend our gratitude to the creators of the \"openassistant-guanaco\" dataset for providing the foundation for this translation. Our thanks also go to the dedicated team members who utilized their expertise to ensure the accuracy and relevance of this Persian translation.",
"## License\nThis dataset is available under an apache-2.0 license, aligning with the original \"openassistant-guanaco\" dataset's licensing terms. For more information, please review the license details on our dataset page.",
"## Contact Us\nFor more information about ZharfaTech and our projects, or if you have any questions regarding this dataset, please contact us at [URL].\n\n---\n\nZharfaTech: Empowering Persian language speakers with advanced LLM technology."
] | [
"TAGS\n#task_categories-text2text-generation #size_categories-1K<n<10K #language-Persian #license-apache-2.0 #region-us \n",
"# Persian OpenAssistant-Guanaco Dataset",
"## About ZharfaTech\nZharfaTech is at the forefront of developing advanced Language Learning Models (LLMs) specifically for the Persian language, aiming to empower over 100 million Persian speakers worldwide. Our objective is to bridge the digital gap in services leveraging LLMs, such as content generation, translation, and customer relationship systems, by providing tailored open-source and closed-source LLM solutions. We focus on democratizing access to LLM technology for Persian language users, developers, and businesses, fostering innovation and collaboration within the community.",
"## Dataset Overview\nThis dataset is the Persian translation of the \"openassistant-guanaco\" dataset, originally found at URL It has been translated to cater to the nuances of the Persian language, utilizing a high-performance local translation model. The translation process was completed in 12 hours using a single Nvidia GPU, ensuring a blend of speed and accuracy.",
"### Key Features:\n- Language: Persian\n- Source: Translated from \"openassistant-guanaco\"\n- Translation Method: Local transitional model\n- Processing Time: 12 hours on a single Nvidia GPU",
"## Objective and Scope\nZharfaTech is dedicated to enhancing the capabilities and reach of LLM technologies for the Persian language through:\n- Development of fine-tuned open-source models for the Persian language.\n- Creation of specialized datasets to support extensive training and refinement.\n- Advanced closed-source model development for specialized solutions.\n\nOur dual approach of fostering community collaboration and providing high-value, specialized solutions aims to advance LLM technologies for the Persian language, making significant strides towards inclusivity and accessibility in digital services.",
"## How to Use This Dataset\nThis dataset is intended for researchers, developers, and businesses interested in developing Persian language capabilities in their LLMs. It can be used to train models for a variety of applications, including but not limited to natural language understanding, content generation, and customer interaction systems.\n\nTo access and utilize this dataset, please follow the instructions below:\n1. Visit our dataset page on Hugging Face: [URL\n2. Review the dataset documentation for details on structure and content.\n3. Download the dataset using the provided Hugging Face commands or API.",
"## Contributing\nWe welcome contributions from the community to improve and expand this dataset.",
"## Acknowledgments\nWe extend our gratitude to the creators of the \"openassistant-guanaco\" dataset for providing the foundation for this translation. Our thanks also go to the dedicated team members who utilized their expertise to ensure the accuracy and relevance of this Persian translation.",
"## License\nThis dataset is available under an apache-2.0 license, aligning with the original \"openassistant-guanaco\" dataset's licensing terms. For more information, please review the license details on our dataset page.",
"## Contact Us\nFor more information about ZharfaTech and our projects, or if you have any questions regarding this dataset, please contact us at [URL].\n\n---\n\nZharfaTech: Empowering Persian language speakers with advanced LLM technology."
] |
83d2eb94f2787fbca21308522950e58d03531ddd | Aman Mangukiya | Amangukiya/assignment2 | [
"language:en",
"code",
"region:us"
] | 2024-02-11T23:56:16+00:00 | {"language": ["en"], "tags": ["code"]} | 2024-02-12T01:34:13+00:00 | [] | [
"en"
] | TAGS
#language-English #code #region-us
| Aman Mangukiya | [] | [
"TAGS\n#language-English #code #region-us \n"
] |
3b839d9cbd855c4a4d7351082c0ca59a16e4e2b4 |
# Dataset Card for Evaluation run of robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit](https://huggingface.co/robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit",
"harness_winogrande_5",
split="train")
```
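
If you only need the most recent run for a given task, the per-task configurations listed in this dataset's configs metadata also expose a "latest" split. Here is a hedged variant of the loader above; the config and split names are taken from that metadata and may differ for other tasks:

```python
from datasets import load_dataset

# Variant sketch: load only the most recent ARC-Challenge run.
# "harness_arc_challenge_25" and the "latest" split name are taken from the
# configs listed in this dataset's metadata; adjust if they differ.
latest_arc = load_dataset(
    "open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit",
    "harness_arc_challenge_25",
    split="latest",
)
print(latest_arc)
```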
## Latest results
These are the [latest results from run 2024-02-12T16:17:26.640039](https://huggingface.co/datasets/open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit/blob/main/results_2024-02-12T16-17-26.640039.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6063275626505344,
"acc_stderr": 0.0332967902436461,
"acc_norm": 0.6109376151305722,
"acc_norm_stderr": 0.03397264723447794,
"mc1": 0.5189718482252142,
"mc1_stderr": 0.017490896405762357,
"mc2": 0.6728616152193471,
"mc2_stderr": 0.015267659398484597
},
"harness|arc:challenge|25": {
"acc": 0.5750853242320819,
"acc_stderr": 0.014445698968520763,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.014175915490000326
},
"harness|hellaswag|10": {
"acc": 0.657837084246166,
"acc_stderr": 0.004734642167493353,
"acc_norm": 0.8455486954789883,
"acc_norm_stderr": 0.003606422623639926
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520193,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520193
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.0267955608481228,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.0267955608481228
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.031353050095330855,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.031353050095330855
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.558974358974359,
"acc_stderr": 0.025174048384000745,
"acc_norm": 0.558974358974359,
"acc_norm_stderr": 0.025174048384000745
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.01720857935778758,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.01720857935778758
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.02220930907316561,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.02220930907316561
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.776500638569604,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.776500638569604,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688225,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688225
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37318435754189944,
"acc_stderr": 0.016175692013381954,
"acc_norm": 0.37318435754189944,
"acc_norm_stderr": 0.016175692013381954
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.02584224870090217,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.02584224870090217
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4276401564537158,
"acc_stderr": 0.012635799922765843,
"acc_norm": 0.4276401564537158,
"acc_norm_stderr": 0.012635799922765843
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.01972205893961806,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.01972205893961806
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5189718482252142,
"mc1_stderr": 0.017490896405762357,
"mc2": 0.6728616152193471,
"mc2_stderr": 0.015267659398484597
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.011807360224025393
},
"harness|gsm8k|5": {
"acc": 0.40333586050037906,
"acc_stderr": 0.013512654781814697
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit | [
"region:us"
] | 2024-02-11T23:57:44+00:00 | {"pretty_name": "Evaluation run of robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit", "dataset_summary": "Dataset automatically created during the evaluation run of model [robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit](https://huggingface.co/robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T16:17:26.640039](https://huggingface.co/datasets/open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit/blob/main/results_2024-02-12T16-17-26.640039.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6063275626505344,\n \"acc_stderr\": 0.0332967902436461,\n \"acc_norm\": 0.6109376151305722,\n \"acc_norm_stderr\": 0.03397264723447794,\n \"mc1\": 0.5189718482252142,\n \"mc1_stderr\": 0.017490896405762357,\n \"mc2\": 0.6728616152193471,\n \"mc2_stderr\": 0.015267659398484597\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520763,\n \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000326\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.657837084246166,\n \"acc_stderr\": 0.004734642167493353,\n \"acc_norm\": 0.8455486954789883,\n \"acc_norm_stderr\": 0.003606422623639926\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n 
\"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520193,\n \"acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520193\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.667741935483871,\n \"acc_stderr\": 0.0267955608481228,\n \"acc_norm\": 0.667741935483871,\n \"acc_norm_stderr\": 0.0267955608481228\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7373737373737373,\n \"acc_stderr\": 0.031353050095330855,\n \"acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.031353050095330855\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n 
\"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.558974358974359,\n \"acc_stderr\": 0.025174048384000745,\n \"acc_norm\": 0.558974358974359,\n \"acc_norm_stderr\": 0.025174048384000745\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7981651376146789,\n \"acc_stderr\": 0.01720857935778758,\n \"acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.01720857935778758\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.02220930907316561,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.02220930907316561\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 
0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688225,\n \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688225\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37318435754189944,\n \"acc_stderr\": 0.016175692013381954,\n \"acc_norm\": 0.37318435754189944,\n \"acc_norm_stderr\": 0.016175692013381954\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.02584224870090217,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.02584224870090217\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4276401564537158,\n \"acc_stderr\": 0.012635799922765843,\n \"acc_norm\": 0.4276401564537158,\n \"acc_norm_stderr\": 0.012635799922765843\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.01972205893961806,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.01972205893961806\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5189718482252142,\n \"mc1_stderr\": 0.017490896405762357,\n \"mc2\": 0.6728616152193471,\n \"mc2_stderr\": 0.015267659398484597\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025393\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.40333586050037906,\n \"acc_stderr\": 0.013512654781814697\n }\n}\n```", "repo_url": "https://huggingface.co/robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|arc:challenge|25_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|arc:challenge|25_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|gsm8k|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|gsm8k|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hellaswag|10_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hellaswag|10_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T23-55-28.657040.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-11T23-55-28.657040.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T16-17-26.640039.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T16-17-26.640039.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T16-17-26.640039.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T16-17-26.640039.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T16-17-26.640039.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": 
"2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T23-55-28.657040.parquet"]}, 
{"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["**/details_harness|winogrande|5_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": ["**/details_harness|winogrande|5_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T16-17-26.640039.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_11T23_55_28.657040", "path": ["results_2024-02-11T23-55-28.657040.parquet"]}, {"split": "2024_02_12T16_17_26.640039", "path": 
["results_2024-02-12T16-17-26.640039.parquet"]}, {"split": "latest", "path": ["results_2024-02-12T16-17-26.640039.parquet"]}]}]} | 2024-02-12T16:19:49+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit
Dataset automatically created during the evaluation run of model robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
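A minimal sketch of that load call, assuming this run's details are published under the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming convention (the exact repository id and the chosen task configuration below are inferred from that convention, not stated in this stripped card):

```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the leaderboard naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_robinsmits__Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit",
    "harness_winogrande_5",  # any of the 63 task configurations can be used here
    split="train",
)
```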
## Latest results
These are the latest results from run 2024-02-12T16:17:26.640039 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit\n\n\n\nDataset automatically created during the evaluation run of model robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-12T16:17:26.640039(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit\n\n\n\nDataset automatically created during the evaluation run of model robinsmits/Mistral-Instruct-7B-v0.2-ChatAlpacaV2-4bit on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-12T16:17:26.640039(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
6f3b7f1ea2d74cedaa614969be29fb14390a366e |
# Dataset Card for Evaluation run of yam-peleg/Experiment9-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yam-peleg/Experiment9-7B](https://huggingface.co/yam-peleg/Experiment9-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yam-peleg__Experiment9-7B",
"harness_winogrande_5",
split="train")
```
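The aggregated scores can be pulled the same way through the "results" configuration. The sketch below assumes the split names follow the timestamp/"latest" convention shown in this card's data-files metadata:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics for each run;
# "latest" is assumed to point at the most recent evaluation, per the metadata listing.
results = load_dataset(
    "open-llm-leaderboard/details_yam-peleg__Experiment9-7B",
    "results",
    split="latest",
)
print(results)  # inspect the aggregated metrics stored for this run
```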
## Latest results
These are the [latest results from run 2024-02-12T00:42:08.192431](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment9-7B/blob/main/results_2024-02-12T00-42-08.192431.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.657124599246198,
"acc_stderr": 0.03199285830904183,
"acc_norm": 0.6581476060335769,
"acc_norm_stderr": 0.03263815632071338,
"mc1": 0.565483476132191,
"mc1_stderr": 0.01735273874925956,
"mc2": 0.7042270854773415,
"mc2_stderr": 0.015001693034141303
},
"harness|arc:challenge|25": {
"acc": 0.697098976109215,
"acc_stderr": 0.013428241573185349,
"acc_norm": 0.7201365187713311,
"acc_norm_stderr": 0.01311904089772592
},
"harness|hellaswag|10": {
"acc": 0.7125074686317466,
"acc_stderr": 0.004516681953879087,
"acc_norm": 0.880601473809998,
"acc_norm_stderr": 0.003235941810943153
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03782728980865469,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03782728980865469
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531003,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726855,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250458,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250458
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608308,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608308
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46033519553072627,
"acc_stderr": 0.01666979959211203,
"acc_norm": 0.46033519553072627,
"acc_norm_stderr": 0.01666979959211203
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079067,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079067
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000328,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000328
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.565483476132191,
"mc1_stderr": 0.01735273874925956,
"mc2": 0.7042270854773415,
"mc2_stderr": 0.015001693034141303
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.011082538847491902
},
"harness|gsm8k|5": {
"acc": 0.6376042456406369,
"acc_stderr": 0.013240654263574767
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_yam-peleg__Experiment9-7B | [
"region:us"
] | 2024-02-12T00:44:24+00:00 | {"pretty_name": "Evaluation run of yam-peleg/Experiment9-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [yam-peleg/Experiment9-7B](https://huggingface.co/yam-peleg/Experiment9-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yam-peleg__Experiment9-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T00:42:08.192431](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment9-7B/blob/main/results_2024-02-12T00-42-08.192431.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.657124599246198,\n \"acc_stderr\": 0.03199285830904183,\n \"acc_norm\": 0.6581476060335769,\n \"acc_norm_stderr\": 0.03263815632071338,\n \"mc1\": 0.565483476132191,\n \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.7042270854773415,\n \"mc2_stderr\": 0.015001693034141303\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.697098976109215,\n \"acc_stderr\": 0.013428241573185349,\n \"acc_norm\": 0.7201365187713311,\n \"acc_norm_stderr\": 0.01311904089772592\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7125074686317466,\n \"acc_stderr\": 0.004516681953879087,\n \"acc_norm\": 0.880601473809998,\n \"acc_norm_stderr\": 0.003235941810943153\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n 
\"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726855,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250458,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250458\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n \"acc_stderr\": 0.013428186370608308,\n \"acc_norm\": 0.8301404853128991,\n 
\"acc_norm_stderr\": 0.013428186370608308\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46033519553072627,\n \"acc_stderr\": 0.01666979959211203,\n \"acc_norm\": 0.46033519553072627,\n \"acc_norm_stderr\": 0.01666979959211203\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n \"acc_stderr\": 0.012747248967079067,\n \"acc_norm\": 0.470013037809648,\n \"acc_norm_stderr\": 0.012747248967079067\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000328,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000328\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.565483476132191,\n \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.7042270854773415,\n \"mc2_stderr\": 0.015001693034141303\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8074191002367798,\n \"acc_stderr\": 0.011082538847491902\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6376042456406369,\n \"acc_stderr\": 0.013240654263574767\n }\n}\n```", "repo_url": "https://huggingface.co/yam-peleg/Experiment9-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|arc:challenge|25_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|gsm8k|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hellaswag|10_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T00-42-08.192431.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T00-42-08.192431.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T00-42-08.192431.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T00-42-08.192431.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T00-42-08.192431.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T00-42-08.192431.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["**/details_harness|winogrande|5_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T00-42-08.192431.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T00_42_08.192431", "path": ["results_2024-02-12T00-42-08.192431.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T00-42-08.192431.parquet"]}]}]} | 2024-02-12T00:44:46+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of yam-peleg/Experiment9-7B
Dataset automatically created during the evaluation run of model yam-peleg/Experiment9-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
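A minimal sketch with the `datasets` library (the configuration name `harness_winogrande_5` is just one of the 63 available configurations and can be swapped for any other listed in the dataset metadata):

```python
from datasets import load_dataset

# Load the per-sample details for the 5-shot Winogrande task;
# the "train" split always points to the latest evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_yam-peleg__Experiment9-7B",
    "harness_winogrande_5",
    split="train",
)
```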
## Latest results
These are the latest results from run 2024-02-12T00:42:08.192431 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
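The aggregated numbers for this run live in the "results" configuration; a small sketch of how they could be re-loaded (assuming the same `datasets` API as above, with the "latest" split pointing at the most recent run):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split mirrors the per-task configurations.
results = load_dataset(
    "open-llm-leaderboard/details_yam-peleg__Experiment9-7B",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated metrics for the latest run
```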
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of yam-peleg/Experiment9-7B\n\n\n\nDataset automatically created during the evaluation run of model yam-peleg/Experiment9-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-12T00:42:08.192431(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of yam-peleg/Experiment9-7B\n\n\n\nDataset automatically created during the evaluation run of model yam-peleg/Experiment9-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-12T00:42:08.192431(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
56f11fac4460d7dad59141ddabaeccc10c2e68e1 |
# Dataset Card for Evaluation run of rizla/rizla-11
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rizla/rizla-11](https://huggingface.co/rizla/rizla-11) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rizla__rizla-11",
"harness_winogrande_5",
split="train")
```
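The aggregated metrics shown in the next section can likewise be pulled from the "results" configuration; a sketch under the same assumptions (the "latest" split points at the most recent run):

```python
from datasets import load_dataset

# Aggregated metrics for the latest evaluation run of rizla/rizla-11.
results = load_dataset(
    "open-llm-leaderboard/details_rizla__rizla-11",
    "results",
    split="latest",
)
```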
## Latest results
These are the [latest results from run 2024-02-12T01:26:02.401376](https://huggingface.co/datasets/open-llm-leaderboard/details_rizla__rizla-11/blob/main/results_2024-02-12T01-26-02.401376.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23196194129343728,
"acc_stderr": 0.029934654752561563,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076911
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
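To work with these numbers programmatically, a small sketch is shown below; it assumes the JSON above has been saved locally as `results.json` (a hypothetical filename) and simply ranks the per-task accuracies:

```python
import json

# Hypothetical local copy of the results JSON shown above.
with open("results.json") as f:
    results = json.load(f)

# Keep every harness task that reports an "acc" value (the TruthfulQA entry
# only has mc1/mc2, and "all" is the aggregate, so both are skipped).
rows = [
    (task, metrics["acc"])
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
]

# Print tasks from highest to lowest accuracy.
for task, acc in sorted(rows, key=lambda r: r[1], reverse=True):
    print(f"{task:<60} {acc:.4f}")
```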
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_rizla__rizla-11 | [
"region:us"
] | 2024-02-12T01:28:17+00:00 | {"pretty_name": "Evaluation run of rizla/rizla-11", "dataset_summary": "Dataset automatically created during the evaluation run of model [rizla/rizla-11](https://huggingface.co/rizla/rizla-11) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rizla__rizla-11\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T01:26:02.401376](https://huggingface.co/datasets/open-llm-leaderboard/details_rizla__rizla-11/blob/main/results_2024-02-12T01-26-02.401376.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23196194129343728,\n \"acc_stderr\": 0.029934654752561563,\n \"acc_norm\": 0.2314240573187148,\n \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 
0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076911\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/rizla/rizla-11", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", 
"data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|arc:challenge|25_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|gsm8k|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hellaswag|10_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T01-26-02.401376.parquet", 
"**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T01-26-02.401376.parquet", 
"**/details_harness|hendrycksTest-college_physics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T01-26-02.401376.parquet", 
"**/details_harness|hendrycksTest-professional_law|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T01-26-02.401376.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": 
["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": 
["**/details_harness|hendrycksTest-management|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["**/details_harness|winogrande|5_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T01-26-02.401376.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T01_26_02.401376", "path": ["results_2024-02-12T01-26-02.401376.parquet"]}, {"split": "latest", "path": ["results_2024-02-12T01-26-02.401376.parquet"]}]}]} | 2024-02-12T01:28:40+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of rizla/rizla-11
Dataset automatically created during the evaluation run of model rizla/rizla-11 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
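A minimal sketch of that load call, following the pattern used in the other leaderboard cards (the repo ID `open-llm-leaderboard/details_rizla__rizla-11` is inferred from the usual `details_<org>__<model>` naming convention and is not stated verbatim in this card):
```python
from datasets import load_dataset

# Repo ID inferred from the leaderboard's "details_<org>__<model>" convention; adjust if it differs.
data = load_dataset("open-llm-leaderboard/details_rizla__rizla-11",
                    "harness_winogrande_5",
                    split="train")
```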
## Latest results
These are the latest results from run 2024-02-12T01:26:02.401376 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
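The aggregated numbers themselves are not reproduced here; assuming the configuration names listed in this card's metadata, they can be pulled from the "results" configuration, for example:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split always points at the most recent run.
results = load_dataset("open-llm-leaderboard/details_rizla__rizla-11",
                       "results",
                       split="latest")
print(results[0])
```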
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of rizla/rizla-11\n\n\n\nDataset automatically created during the evaluation run of model rizla/rizla-11 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-12T01:26:02.401376(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of rizla/rizla-11\n\n\n\nDataset automatically created during the evaluation run of model rizla/rizla-11 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-12T01:26:02.401376(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
2310ff4a9cd367c9fcf44b092c476938e8cafdf9 | # Guanaco-1k: Lazy Llama 2 Formatting
This is a subset (1000 samples) of the excellent [`timdettmers/openassistant-guanaco`](https://huggingface.co/datasets/timdettmers/openassistant-guanaco) dataset, processed to match Llama 2's prompt format as described [in this article](https://huggingface.co/blog/llama2#how-to-prompt-llama-2). It was created using the following [colab notebook](https://colab.research.google.com/drive/1Ad7a9zMmkxuXTOh1Z7-rNSICA4dybpM2?usp=sharing).
Useful if you don't want to reformat it yourself (e.g., using a script). It was designed for [this article](https://mlabonne.github.io/blog/posts/Fine_Tune_Your_Own_Llama_2_Model_in_a_Colab_Notebook.html) about fine-tuning a Llama 2 (chat) model in a Google Colab.
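For reference, a rough sketch of the reformatting this card describes: turning the original `### Human:` / `### Assistant:` markup of openassistant-guanaco into Llama 2's `[INST]` template. The helper below is illustrative only; the linked notebook is the authoritative version and also handles multi-turn conversations.
```python
import re

def to_llama2(example: str) -> str:
    # Single-turn sketch only: map the guanaco markers onto Llama 2's chat template.
    text = re.sub(r"### Human:", "<s>[INST]", example)
    text = re.sub(r"### Assistant:", "[/INST]", text)
    return text + " </s>"

print(to_llama2("### Human: Hello! ### Assistant: Hi, how can I help?"))
# <s>[INST] Hello! [/INST] Hi, how can I help? </s>
```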
| Afjalru/loan-prediction | [
"region:us"
] | 2024-02-12T01:36:58+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1654448, "num_examples": 1000}], "download_size": 966693, "dataset_size": 1654448}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-12T01:52:40+00:00 | [] | [] | TAGS
#region-us
| # Guanaco-1k: Lazy Llama 2 Formatting
This is a subset (1000 samples) of the excellent 'timdettmers/openassistant-guanaco' dataset, processed to match Llama 2's prompt format as described in this article. It was created using the following colab notebook.
Useful if you don't want to reformat it by yourself (e.g., using a script). It was designed for this article about fine-tuning a Llama 2 (chat) model in a Google Colab.
| [
"# Guanaco-1k: Lazy Llama 2 Formatting\n\nThis is a subset (1000 samples) of the excellent 'timdettmers/openassistant-guanaco' dataset, processed to match Llama 2's prompt format as described in this article. It was created using the following colab notebook.\n\nUseful if you don't want to reformat it by yourself (e.g., using a script). It was designed for this article about fine-tuning a Llama 2 (chat) model in a Google Colab."
] | [
"TAGS\n#region-us \n",
"# Guanaco-1k: Lazy Llama 2 Formatting\n\nThis is a subset (1000 samples) of the excellent 'timdettmers/openassistant-guanaco' dataset, processed to match Llama 2's prompt format as described in this article. It was created using the following colab notebook.\n\nUseful if you don't want to reformat it by yourself (e.g., using a script). It was designed for this article about fine-tuning a Llama 2 (chat) model in a Google Colab."
] |
4ae1a35862cffbc21dc1a2ee30ab69246cbb8ad2 | # Dataset Card for "boolq_transformed_and_weak_labeled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | skrishna/boolq_transformed_and_weak_labeled | [
"region:us"
] | 2024-02-12T01:39:38+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}, {"name": "ground_truth", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4710755, "num_examples": 6392}], "download_size": 2623317, "dataset_size": 4710755}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-14T23:52:28+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "boolq_transformed_and_weak_labeled"
More Information needed | [
"# Dataset Card for \"boolq_transformed_and_weak_labeled\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"boolq_transformed_and_weak_labeled\"\n\nMore Information needed"
] |
aa7e46f64442242a3b6e4a01d64934458a864d09 | # Dataset Card for "final_c_x86_O0_exebench_numeric_full_2k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | zhangshuoming/final_c_x86_O0_exebench_numeric_full_2k | [
"region:us"
] | 2024-02-12T01:45:58+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2512589, "num_examples": 2000}], "download_size": 0, "dataset_size": 2512589}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-12T01:46:11+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "final_c_x86_O0_exebench_numeric_full_2k"
More Information needed | [
"# Dataset Card for \"final_c_x86_O0_exebench_numeric_full_2k\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"final_c_x86_O0_exebench_numeric_full_2k\"\n\nMore Information needed"
] |
ad8022de1d8019f293478c8ecd7e4ad34f3ed318 |
# Dataset Summary
This is a protein structure and function prediction dataset for the papers GearNet (https://arxiv.org/abs/2203.06125), ESM-GearNet (https://arxiv.org/abs/2303.06275), ESM-S (https://arxiv.org/abs/2402.05856) and ProtIR ().
The dataset is first processed by DeepFRI (https://www.nature.com/articles/s41467-021-23303-9) and then collected by CDConv (https://openreview.net/forum?id=P5Z-Zl9XJ7).
The original files can also be downloaded from the GitHub repository of CDConv (https://github.com/hehefan/Continuous-Discrete-Convolution).
We upload the dataset here for ease of downloading.
The details are shown in the following table.
| Dataset | #Train | #Valid | #Test 95% | #Test 50% | #Test 30% |
|------------------|--------|--------|----------------|---------------------|------------|
| EnzymeCommission | 15,550 | 1,729 | 1,919 | 1,117 | 720 |
| GeneOntology | 29,898 | 3,322 | 3,416 | 2,199 | 1,717 |
| Fold | 12,312 | 736 | 1,272 (Family) | 1,254 (Superfamily) | 718 (Fold) |
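Since the files are mirrored here mainly for convenient downloading, a minimal sketch of fetching the whole repository with `huggingface_hub` (the `local_dir` path below is just an example):

```python
from huggingface_hub import snapshot_download

# Downloads all EnzymeCommission / GeneOntology / Fold files to a local folder.
snapshot_download(repo_id="Oxer11/Protein-Function-Annotation",
                  repo_type="dataset",
                  local_dir="./protein-function-annotation")
```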
# Evaluation Pipeline
Please refer to ESM-S (https://github.com/DeepGraphLearning/esm-s) for the training and evaluation code.
| Oxer11/Protein-Function-Annotation | [
"language:en",
"license:apache-2.0",
"Protein Langauge Model",
"AI for Drug Discovery",
"AI for Science",
"arxiv:2203.06125",
"arxiv:2303.06275",
"arxiv:2402.05856",
"region:us"
] | 2024-02-12T02:13:13+00:00 | {"language": ["en"], "license": "apache-2.0", "tags": ["Protein Langauge Model", "AI for Drug Discovery", "AI for Science"]} | 2024-02-12T03:46:37+00:00 | [
"2203.06125",
"2303.06275",
"2402.05856"
] | [
"en"
] | TAGS
#language-English #license-apache-2.0 #Protein Langauge Model #AI for Drug Discovery #AI for Science #arxiv-2203.06125 #arxiv-2303.06275 #arxiv-2402.05856 #region-us
| Dataset Summary
===============
This is a protein structure and protein prediction dataset for the papers GearNet (URL ESM-GearNet (URL ESM-S (URL and ProtIR ().
The dataset is first processed by DeepFRI (URL and then collected by CDConv (URL
The original files can also be downloaded from the github of CDConv (URL
We upload the dataset here for the ease of downloading.
The details are shown in the following table.
Evaluation Pipeline
===================
Please refer to ESM-S (URL for the training and evaluation code.
| [] | [
"TAGS\n#language-English #license-apache-2.0 #Protein Langauge Model #AI for Drug Discovery #AI for Science #arxiv-2203.06125 #arxiv-2303.06275 #arxiv-2402.05856 #region-us \n"
] |
52ffdf6091752811d5066e19cb94fb5fdd9316bb | # Dataset Card for "hanna-gemini-annotated"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | llm-aes/hanna-gemini-annotated | [
"region:us"
] | 2024-02-12T02:40:08+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "preference", "dtype": "float64"}, {"name": "output_1", "dtype": "string"}, {"name": "output_2", "dtype": "string"}, {"name": "time_per_example", "dtype": "float64"}, {"name": "annotator", "dtype": "string"}, {"name": "price_per_example", "dtype": "float64"}, {"name": "raw_completion", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 36546903, "num_examples": 12430}], "download_size": 2071872, "dataset_size": 36546903}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-12T02:40:10+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "hanna-gemini-annotated"
More Information needed | [
"# Dataset Card for \"hanna-gemini-annotated\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"hanna-gemini-annotated\"\n\nMore Information needed"
] |
6c48581138553d59031f9e7558ca543219613752 |
# Dataset Card for Evaluation run of Gille/StrangeMerges_21-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_21-7B-slerp](https://huggingface.co/Gille/StrangeMerges_21-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_21-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-12T03:48:30.394342](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_21-7B-slerp/blob/main/results_2024-02-12T03-48-30.394342.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6568342231733286,
"acc_stderr": 0.03200126071317064,
"acc_norm": 0.6560973113610193,
"acc_norm_stderr": 0.03267215244947336,
"mc1": 0.598531211750306,
"mc1_stderr": 0.017160273901693657,
"mc2": 0.7380792199210964,
"mc2_stderr": 0.014513937728917603
},
"harness|arc:challenge|25": {
"acc": 0.7133105802047781,
"acc_stderr": 0.013214986329274776,
"acc_norm": 0.742320819112628,
"acc_norm_stderr": 0.012780770562768396
},
"harness|hellaswag|10": {
"acc": 0.7183827922724557,
"acc_stderr": 0.0044886843979795,
"acc_norm": 0.8894642501493726,
"acc_norm_stderr": 0.003129155503881717
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249386,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249386
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188712,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188712
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0274796030105388,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0274796030105388
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834841,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834841
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47522816166883963,
"acc_stderr": 0.012754553719781753,
"acc_norm": 0.47522816166883963,
"acc_norm_stderr": 0.012754553719781753
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578323,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578323
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.598531211750306,
"mc1_stderr": 0.017160273901693657,
"mc2": 0.7380792199210964,
"mc2_stderr": 0.014513937728917603
},
"harness|winogrande|5": {
"acc": 0.846093133385951,
"acc_stderr": 0.010141944523750033
},
"harness|gsm8k|5": {
"acc": 0.711144806671721,
"acc_stderr": 0.012484219800126666
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Gille__StrangeMerges_21-7B-slerp | [
"region:us"
] | 2024-02-12T03:50:50+00:00 | {"pretty_name": "Evaluation run of Gille/StrangeMerges_21-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_21-7B-slerp](https://huggingface.co/Gille/StrangeMerges_21-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_21-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T03:48:30.394342](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_21-7B-slerp/blob/main/results_2024-02-12T03-48-30.394342.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6568342231733286,\n \"acc_stderr\": 0.03200126071317064,\n \"acc_norm\": 0.6560973113610193,\n \"acc_norm_stderr\": 0.03267215244947336,\n \"mc1\": 0.598531211750306,\n \"mc1_stderr\": 0.017160273901693657,\n \"mc2\": 0.7380792199210964,\n \"mc2_stderr\": 0.014513937728917603\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7133105802047781,\n \"acc_stderr\": 0.013214986329274776,\n \"acc_norm\": 0.742320819112628,\n \"acc_norm_stderr\": 0.012780770562768396\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7183827922724557,\n \"acc_stderr\": 0.0044886843979795,\n \"acc_norm\": 0.8894642501493726,\n \"acc_norm_stderr\": 0.003129155503881717\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249386,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249386\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n 
\"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188712,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188712\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.02366129639396428,\n \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n 
\"acc_stderr\": 0.013664230995834841,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834841\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n \"acc_stderr\": 0.012754553719781753,\n \"acc_norm\": 0.47522816166883963,\n \"acc_norm_stderr\": 0.012754553719781753\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578323,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578323\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.598531211750306,\n \"mc1_stderr\": 0.017160273901693657,\n \"mc2\": 0.7380792199210964,\n \"mc2_stderr\": 0.014513937728917603\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.846093133385951,\n \"acc_stderr\": 0.010141944523750033\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.711144806671721,\n \"acc_stderr\": 0.012484219800126666\n }\n}\n```", "repo_url": 
"https://huggingface.co/Gille/StrangeMerges_21-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|arc:challenge|25_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|gsm8k|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hellaswag|10_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T03-48-30.394342.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T03-48-30.394342.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T03-48-30.394342.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T03-48-30.394342.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T03-48-30.394342.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T03-48-30.394342.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["**/details_harness|winogrande|5_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T03-48-30.394342.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T03_48_30.394342", "path": ["results_2024-02-12T03-48-30.394342.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T03-48-30.394342.parquet"]}]}]} | 2024-02-12T03:51:10+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Gille/StrangeMerges_21-7B-slerp
Dataset automatically created during the evaluation run of model Gille/StrangeMerges_21-7B-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
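A minimal sketch (the repo id below is inferred from the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming for this model, and `harness_winogrande_5` is one of the per-task configs listed in this run's metadata; adjust if the actual id differs):

```python
from datasets import load_dataset

# Hypothetical repo id, following the leaderboard's standard naming pattern
# for this model; "harness_winogrande_5" is one of the per-task configs.
data = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_21-7B-slerp",
    "harness_winogrande_5",
    split="train",
)
print(data)
```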
## Latest results
These are the latest results from run 2024-02-12T03:48:30.394342 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Gille/StrangeMerges_21-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_21-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-12T03:48:30.394342(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Gille/StrangeMerges_21-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_21-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-12T03:48:30.394342(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
73dbe4707b028abf9705e5136aea485ee3648630 |
## Python Copilot Instructions on How to Code using Alpaca and Yaml
Training and test datasets for building coding multimodal models that understand how to use the open source GitHub projects for the [Autogen](https://github.com/microsoft/autogen/tree/main) and multimodal **Qwen AI** project:
- [Qwen](https://github.com/QwenLM/Qwen)
- [Qwen Agent](https://github.com/QwenLM/Qwen-Agent)
- [Qwen VL Chat](https://github.com/QwenLM/Qwen-VL)
- [Qwen Audio](https://github.com/QwenLM/Qwen-Audio)
This dataset is the 2024-02-11 update for the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based off the code), returns (ordered based off the code), arguments (ordered based off the code), and more.
- Rows: 1075795
- Size: 1.8 GB
- Data type: instruct
- Format: Introduction on code usage using alpaca and yaml response
- Number of python repos: 1275
### How to use the datasets
#### Load Autogen Schema Dataset
```python
from datasets import load_dataset
ds_name = (
"matlok"
"/"
"python-text-copilot-training-"
"instruct-ai-research-"
"2024-02-11"
)
dc = "autogen"
ds = load_dataset(ds_name, dc, verification_mode="no_checks")
print(f"ds={ds_name} dataset_config={dc} has {len(ds['view_schema']['file_path'])} unique python modules")
```
```
dataset_config=autogen has 130 unique python modules
```
### Schema
The instruction alpaca text with yaml response is in the **desc** column:
```json
{
"active": "bool",
"args": "string",
"args_len": "float64",
"audio_file": "string",
"audio_path": "string",
"class_bases": "string",
"class_name": "string",
"code": "string",
"code_len": "float64",
"desc": "string",
"desc_docstr": "string",
"desc_docstr_len": "float64",
"desc_len": "int64",
"docstr": "string",
"docstr_len": "int64",
"file_path": "string",
"file_type": "string",
"function_names": "string",
"gen_bytes": "int64",
"gen_data_type": "string",
"gen_mode": "string",
"gen_size": "int64",
"gen_valid": "bool",
"height": "int64",
"image_file": "string",
"image_path": "string",
"method_names": "string",
"name": "string",
"num_all_bases": "int64",
"num_bases": "int64",
"num_classes": "int64",
"num_functions": "float64",
"num_imports": "int64",
"num_methods": "float64",
"prompts": "string",
"raises": "string",
"raises_len": "float64",
"recsize": "int64",
"repo": "string",
"returns": "string",
"returns_len": "float64",
"size": "int64",
"src_object": "string",
"total_objects": "int64",
"usage": "string",
"usages": "string",
"width": "int64"
}
```
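As a quick illustration (a sketch reusing the `ds` object loaded above; field names follow the schema shown here):

```python
# Grab one record from the "view_schema" split and read the alpaca instruction
# text with its yaml response from the "desc" column.
row = ds["view_schema"][0]
print(row["file_path"])   # source python module for this record
print(row["desc"][:500])  # alpaca prompt + yaml answer, truncated for display
```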
| matlok/python-text-copilot-training-instruct-ai-research-2024-02-11 | [
"task_categories:text-generation",
"task_categories:question-answering",
"task_ids:parsing",
"size_categories:1M<n<10M",
"license:other",
"python-copilot",
"python-coding",
"python-architecture",
"knowledge-graphs",
"multimodal",
"text-image-audio",
"fine-tuning",
"training",
"question-answering",
"image-knowledge-graph",
"alpaca",
"mp3",
"png",
"text",
"instruct",
"coding",
"task",
"prompt",
"response",
"yaml",
"region:us"
] | 2024-02-12T03:57:58+00:00 | {"license": ["other"], "size_categories": ["1M<n<10M"], "task_categories": ["text-generation", "question-answering"], "task_ids": ["parsing"], "pretty_name": "2024-02-11 - python copilot instructions on how to code using alpaca and yaml", "dataset_info": [{"config_name": "autogen", "splits": [{"name": "view_schema"}]}], "configs": [{"config_name": "autogen", "data_files": [{"split": "view_schema", "path": "schema/train-0001-autogen-autogen.parquet"}]}], "tags": ["python-copilot", "python-coding", "python-architecture", "knowledge-graphs", "multimodal", "text-image-audio", "fine-tuning", "training", "question-answering", "image-knowledge-graph", "alpaca", "mp3", "png", "text", "instruct", "coding", "task", "prompt", "response", "yaml"]} | 2024-02-12T04:48:34+00:00 | [] | [] | TAGS
#task_categories-text-generation #task_categories-question-answering #task_ids-parsing #size_categories-1M<n<10M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #coding #task #prompt #response #yaml #region-us
|
## Python Copilot Instructions on How to Code using Alpaca and Yaml
Training and test datasets for building coding multimodal models that understand how to use the open source GitHub projects for the Autogen and multimodal Qwen AI project:
- Qwen
- Qwen Agent
- Qwen VL Chat
- Qwen Audio
This dataset is the 2024-02-11 update for the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.
### Details
Each row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based off the code), returns (ordered based off the code), arguments (ordered based off the code), and more.
- Rows: 1075795
- Size: 1.8 GB
- Data type: instruct
- Format: Introduction on code usage using alpaca and yaml response
- Number of python repos: 1275
### How to use the datasets
#### Load Autogen Schema Dataset
### Schema
The instruction alpaca text with yaml response is in the desc column:
| [
"## Python Copilot Instructions on How to Code using Alpaca and Yaml\n\nTraining and test datasets for building coding multimodal models that understand how to use the open source GitHub projects for the Autogen and multimodal Qwen AI project:\n\n- Qwen\n- Qwen Agent\n- Qwen VL Chat\n- Qwen Audio\n\nThis dataset is the 2024-02-11 update for the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based off the code), returns (ordered based off the code), arguments (ordered based off the code), and more.\n\n- Rows: 1075795\n- Size: 1.8 GB\n- Data type: instruct\n- Format: Introduction on code usage using alpaca and yaml response\n- Number of python repos: 1275",
"### How to use the datasets",
"#### Load Autogen Schema Dataset",
"### Schema\n\nThe instruction alpaca text with yaml response is in the desc column:"
] | [
"TAGS\n#task_categories-text-generation #task_categories-question-answering #task_ids-parsing #size_categories-1M<n<10M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #coding #task #prompt #response #yaml #region-us \n",
"## Python Copilot Instructions on How to Code using Alpaca and Yaml\n\nTraining and test datasets for building coding multimodal models that understand how to use the open source GitHub projects for the Autogen and multimodal Qwen AI project:\n\n- Qwen\n- Qwen Agent\n- Qwen VL Chat\n- Qwen Audio\n\nThis dataset is the 2024-02-11 update for the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.",
"### Details\n\nEach row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based off the code), returns (ordered based off the code), arguments (ordered based off the code), and more.\n\n- Rows: 1075795\n- Size: 1.8 GB\n- Data type: instruct\n- Format: Introduction on code usage using alpaca and yaml response\n- Number of python repos: 1275",
"### How to use the datasets",
"#### Load Autogen Schema Dataset",
"### Schema\n\nThe instruction alpaca text with yaml response is in the desc column:"
] |
b758f671c266993080e8003893d35ba1f71f214f |
# Dataset Card for Evaluation run of JaeyeonKang/CCK_Asura_v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [JaeyeonKang/CCK_Asura_v1](https://huggingface.co/JaeyeonKang/CCK_Asura_v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v1",
"harness_winogrande_5",
split="train")
```
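
Each evaluated task has its own config following the `harness_<task>_<n_shots>` pattern; a small convenience sketch for listing them (assuming network access to the Hub):

```python
from datasets import get_dataset_config_names

# Lists every per-task config plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v1")
print(len(configs), configs[:5])
```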
## Latest results
These are the [latest results from run 2024-02-12T04:58:51.033818](https://huggingface.co/datasets/open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v1/blob/main/results_2024-02-12T04-58-51.033818.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7535469467828841,
"acc_stderr": 0.028473742983492905,
"acc_norm": 0.7564527472308834,
"acc_norm_stderr": 0.029025433712812198,
"mc1": 0.565483476132191,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.7174856574663107,
"mc2_stderr": 0.014605715133518151
},
"harness|arc:challenge|25": {
"acc": 0.7013651877133106,
"acc_stderr": 0.013374078615068749,
"acc_norm": 0.7389078498293515,
"acc_norm_stderr": 0.012835523909473848
},
"harness|hellaswag|10": {
"acc": 0.719577773351922,
"acc_stderr": 0.004482874732237349,
"acc_norm": 0.8906592312288388,
"acc_norm_stderr": 0.003114285077228029
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7111111111111111,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.7111111111111111,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8552631578947368,
"acc_stderr": 0.0286319518459304,
"acc_norm": 0.8552631578947368,
"acc_norm_stderr": 0.0286319518459304
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372267,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.026280550932848087,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.026280550932848087
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.04971358884367406,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.04971358884367406
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7404255319148936,
"acc_stderr": 0.02865917937429232,
"acc_norm": 0.7404255319148936,
"acc_norm_stderr": 0.02865917937429232
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7241379310344828,
"acc_stderr": 0.03724563619774632,
"acc_norm": 0.7241379310344828,
"acc_norm_stderr": 0.03724563619774632
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5343915343915344,
"acc_stderr": 0.02569032176249385,
"acc_norm": 0.5343915343915344,
"acc_norm_stderr": 0.02569032176249385
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.864516129032258,
"acc_stderr": 0.019469334586486933,
"acc_norm": 0.864516129032258,
"acc_norm_stderr": 0.019469334586486933
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.034139638059062345,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.034139638059062345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9040404040404041,
"acc_stderr": 0.020984808610047933,
"acc_norm": 0.9040404040404041,
"acc_norm_stderr": 0.020984808610047933
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607558,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607558
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7897435897435897,
"acc_stderr": 0.020660597485026945,
"acc_norm": 0.7897435897435897,
"acc_norm_stderr": 0.020660597485026945
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.030149135601365944,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.030149135601365944
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.865546218487395,
"acc_stderr": 0.022159373072744442,
"acc_norm": 0.865546218487395,
"acc_norm_stderr": 0.022159373072744442
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9192660550458716,
"acc_stderr": 0.011680172292862086,
"acc_norm": 0.9192660550458716,
"acc_norm_stderr": 0.011680172292862086
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6990740740740741,
"acc_stderr": 0.031280390843298804,
"acc_norm": 0.6990740740740741,
"acc_norm_stderr": 0.031280390843298804
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073315,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073315
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9240506329113924,
"acc_stderr": 0.0172446332510657,
"acc_norm": 0.9240506329113924,
"acc_norm_stderr": 0.0172446332510657
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.02871877688934232,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.02871877688934232
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.026243194054073878,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.026243194054073878
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.0314570385430625,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.0314570385430625
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971723,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971723
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6339285714285714,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.6339285714285714,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808629,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808629
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.01700436856813237,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.01700436856813237
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8914431673052363,
"acc_stderr": 0.011124283175851183,
"acc_norm": 0.8914431673052363,
"acc_norm_stderr": 0.011124283175851183
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8439306358381503,
"acc_stderr": 0.019539014685374036,
"acc_norm": 0.8439306358381503,
"acc_norm_stderr": 0.019539014685374036
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6636871508379888,
"acc_stderr": 0.0158010037291459,
"acc_norm": 0.6636871508379888,
"acc_norm_stderr": 0.0158010037291459
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8300653594771242,
"acc_stderr": 0.02150538312123138,
"acc_norm": 0.8300653594771242,
"acc_norm_stderr": 0.02150538312123138
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.819935691318328,
"acc_stderr": 0.02182342285774494,
"acc_norm": 0.819935691318328,
"acc_norm_stderr": 0.02182342285774494
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8487654320987654,
"acc_stderr": 0.019935086092149886,
"acc_norm": 0.8487654320987654,
"acc_norm_stderr": 0.019935086092149886
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6134751773049646,
"acc_stderr": 0.02904919034254347,
"acc_norm": 0.6134751773049646,
"acc_norm_stderr": 0.02904919034254347
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.589960886571056,
"acc_stderr": 0.012561837621962032,
"acc_norm": 0.589960886571056,
"acc_norm_stderr": 0.012561837621962032
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.023157468308559345,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.023157468308559345
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8218954248366013,
"acc_stderr": 0.015478369653108568,
"acc_norm": 0.8218954248366013,
"acc_norm_stderr": 0.015478369653108568
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8244897959183674,
"acc_stderr": 0.02435280072297001,
"acc_norm": 0.8244897959183674,
"acc_norm_stderr": 0.02435280072297001
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9154228855721394,
"acc_stderr": 0.019675343217199173,
"acc_norm": 0.9154228855721394,
"acc_norm_stderr": 0.019675343217199173
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.021904291355759057,
"acc_norm": 0.95,
"acc_norm_stderr": 0.021904291355759057
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.565483476132191,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.7174856574663107,
"mc2_stderr": 0.014605715133518151
},
"harness|winogrande|5": {
"acc": 0.8634569850039463,
"acc_stderr": 0.0096502429002916
},
"harness|gsm8k|5": {
"acc": 0.6808188021228203,
"acc_stderr": 0.012840345676251653
}
}
```
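
The same aggregated numbers are stored in the "results" config; a hedged sketch for reading them programmatically (the exact column layout of the results parquet is not shown above, so inspect it before relying on specific fields):

```python
from datasets import load_dataset

# "latest" always points at the most recent run's aggregated results.
results = load_dataset(
    "open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v1",
    "results",
    split="latest",
)
print(results.column_names)  # inspect the layout before extracting metrics
```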
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v1 | [
"region:us"
] | 2024-02-12T05:01:16+00:00 | {"pretty_name": "Evaluation run of JaeyeonKang/CCK_Asura_v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [JaeyeonKang/CCK_Asura_v1](https://huggingface.co/JaeyeonKang/CCK_Asura_v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T04:58:51.033818](https://huggingface.co/datasets/open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v1/blob/main/results_2024-02-12T04-58-51.033818.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7535469467828841,\n \"acc_stderr\": 0.028473742983492905,\n \"acc_norm\": 0.7564527472308834,\n \"acc_norm_stderr\": 0.029025433712812198,\n \"mc1\": 0.565483476132191,\n \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.7174856574663107,\n \"mc2_stderr\": 0.014605715133518151\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068749,\n \"acc_norm\": 0.7389078498293515,\n \"acc_norm_stderr\": 0.012835523909473848\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.719577773351922,\n \"acc_stderr\": 0.004482874732237349,\n \"acc_norm\": 0.8906592312288388,\n \"acc_norm_stderr\": 0.003114285077228029\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7111111111111111,\n \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.7111111111111111,\n \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8552631578947368,\n \"acc_stderr\": 0.0286319518459304,\n \"acc_norm\": 0.8552631578947368,\n \"acc_norm_stderr\": 0.0286319518459304\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372267,\n \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.026280550932848087,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.026280550932848087\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n 
\"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.04971358884367406,\n \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.04971358884367406\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7404255319148936,\n \"acc_stderr\": 0.02865917937429232,\n \"acc_norm\": 0.7404255319148936,\n \"acc_norm_stderr\": 0.02865917937429232\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.03724563619774632,\n \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.03724563619774632\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5343915343915344,\n \"acc_stderr\": 0.02569032176249385,\n \"acc_norm\": 0.5343915343915344,\n \"acc_norm_stderr\": 0.02569032176249385\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.864516129032258,\n \"acc_stderr\": 0.019469334586486933,\n \"acc_norm\": 0.864516129032258,\n \"acc_norm_stderr\": 0.019469334586486933\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.034139638059062345,\n \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.034139638059062345\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9040404040404041,\n \"acc_stderr\": 0.020984808610047933,\n \"acc_norm\": 0.9040404040404041,\n \"acc_norm_stderr\": 0.020984808610047933\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607558,\n \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607558\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.7897435897435897,\n \"acc_stderr\": 0.020660597485026945,\n \"acc_norm\": 0.7897435897435897,\n \"acc_norm_stderr\": 0.020660597485026945\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.030149135601365944,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.030149135601365944\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.865546218487395,\n \"acc_stderr\": 0.022159373072744442,\n \"acc_norm\": 0.865546218487395,\n \"acc_norm_stderr\": 0.022159373072744442\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248436,\n \"acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248436\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9192660550458716,\n \"acc_stderr\": 0.011680172292862086,\n \"acc_norm\": 0.9192660550458716,\n \"acc_norm_stderr\": 0.011680172292862086\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6990740740740741,\n \"acc_stderr\": 0.031280390843298804,\n \"acc_norm\": 0.6990740740740741,\n \"acc_norm_stderr\": 0.031280390843298804\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073315,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073315\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9240506329113924,\n \"acc_stderr\": 0.0172446332510657,\n \"acc_norm\": 0.9240506329113924,\n \"acc_norm_stderr\": 0.0172446332510657\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.02871877688934232,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.02871877688934232\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.026243194054073878,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.026243194054073878\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.0314570385430625,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.0314570385430625\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971723,\n \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971723\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6339285714285714,\n \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.6339285714285714,\n \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808629,\n \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808629\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n \"acc_stderr\": 0.01700436856813237,\n \"acc_norm\": 0.9273504273504274,\n \"acc_norm_stderr\": 0.01700436856813237\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8914431673052363,\n \"acc_stderr\": 0.011124283175851183,\n \"acc_norm\": 
0.8914431673052363,\n \"acc_norm_stderr\": 0.011124283175851183\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8439306358381503,\n \"acc_stderr\": 0.019539014685374036,\n \"acc_norm\": 0.8439306358381503,\n \"acc_norm_stderr\": 0.019539014685374036\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6636871508379888,\n \"acc_stderr\": 0.0158010037291459,\n \"acc_norm\": 0.6636871508379888,\n \"acc_norm_stderr\": 0.0158010037291459\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8300653594771242,\n \"acc_stderr\": 0.02150538312123138,\n \"acc_norm\": 0.8300653594771242,\n \"acc_norm_stderr\": 0.02150538312123138\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.819935691318328,\n \"acc_stderr\": 0.02182342285774494,\n \"acc_norm\": 0.819935691318328,\n \"acc_norm_stderr\": 0.02182342285774494\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8487654320987654,\n \"acc_stderr\": 0.019935086092149886,\n \"acc_norm\": 0.8487654320987654,\n \"acc_norm_stderr\": 0.019935086092149886\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6134751773049646,\n \"acc_stderr\": 0.02904919034254347,\n \"acc_norm\": 0.6134751773049646,\n \"acc_norm_stderr\": 0.02904919034254347\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.589960886571056,\n \"acc_stderr\": 0.012561837621962032,\n \"acc_norm\": 0.589960886571056,\n \"acc_norm_stderr\": 0.012561837621962032\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.023157468308559345,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.023157468308559345\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8218954248366013,\n \"acc_stderr\": 0.015478369653108568,\n \"acc_norm\": 0.8218954248366013,\n \"acc_norm_stderr\": 0.015478369653108568\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8244897959183674,\n \"acc_stderr\": 0.02435280072297001,\n \"acc_norm\": 0.8244897959183674,\n \"acc_norm_stderr\": 0.02435280072297001\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9154228855721394,\n \"acc_stderr\": 0.019675343217199173,\n \"acc_norm\": 0.9154228855721394,\n \"acc_norm_stderr\": 0.019675343217199173\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759057,\n \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759057\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.565483476132191,\n \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.7174856574663107,\n \"mc2_stderr\": 0.014605715133518151\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8634569850039463,\n \"acc_stderr\": 0.0096502429002916\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6808188021228203,\n \"acc_stderr\": 0.012840345676251653\n }\n}\n```", "repo_url": "https://huggingface.co/JaeyeonKang/CCK_Asura_v1", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|arc:challenge|25_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|gsm8k|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hellaswag|10_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T04-58-51.033818.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T04-58-51.033818.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T04-58-51.033818.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T04-58-51.033818.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T04-58-51.033818.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T04-58-51.033818.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["**/details_harness|winogrande|5_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T04-58-51.033818.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T04_58_51.033818", "path": ["results_2024-02-12T04-58-51.033818.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T04-58-51.033818.parquet"]}]}]} | 2024-02-12T05:01:42+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of JaeyeonKang/CCK_Asura_v1
Dataset automatically created during the evaluation run of model JaeyeonKang/CCK_Asura_v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
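For example, a minimal sketch — the repository name is assumed to follow the leaderboard's usual `details_<org>__<model>` pattern for this model, and the configuration and split names are taken from the configs listed in the metadata above:

```python
from datasets import load_dataset

# Assumed details repository for this run, following the leaderboard's naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v1",
    "harness_winogrande_5",  # any configuration listed in the metadata above works
    split="latest",
)
```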
## Latest results
These are the latest results from run 2024-02-12T04:58:51.033818 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of JaeyeonKang/CCK_Asura_v1\n\n\n\nDataset automatically created during the evaluation run of model JaeyeonKang/CCK_Asura_v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-12T04:58:51.033818(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of JaeyeonKang/CCK_Asura_v1\n\n\n\nDataset automatically created during the evaluation run of model JaeyeonKang/CCK_Asura_v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-12T04:58:51.033818(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
8ad28f2551dcaa2ae7cef402b26ec3c1eb7eaea6 | # Dataset Card for "final_c_x86_O0_exebench_numeric_full_json_cleaned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | zhangshuoming/final_c_x86_O0_exebench_numeric_full_json_cleaned | [
"region:us"
] | 2024-02-12T05:02:16+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 16258218.0, "num_examples": 12833}], "download_size": 4646995, "dataset_size": 16258218.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-12T05:02:24+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "final_c_x86_O0_exebench_numeric_full_json_cleaned"
More Information needed | [
"# Dataset Card for \"final_c_x86_O0_exebench_numeric_full_json_cleaned\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"final_c_x86_O0_exebench_numeric_full_json_cleaned\"\n\nMore Information needed"
] |
c4eb4c959aa2e58ae0106e7d2202b63c29c04c62 |
# Bangumi Image Base of Reborn To Master The Blade From Hero-king To Extraordinary Squire
This is the image base of bangumi Reborn to Master the Blade From Hero-King to Extraordinary Squire, we detected 38 characters, 1790 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may still contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
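If you plan to clean a single character's pack by hand, a minimal sketch for fetching one of the per-character archives with `huggingface_hub` might look like the following (character 13 is just an example; the zip paths mirror the download links in the table below):

```python
import zipfile
from huggingface_hub import hf_hub_download

# Download one character pack from this dataset repository (path taken from the table below).
path = hf_hub_download(
    repo_id="BangumiBase/reborntomasterthebladefromherokingtoextraordinarysquire",
    filename="13/dataset.zip",
    repo_type="dataset",
)

# Extract it locally for manual inspection and cleaning.
zipfile.ZipFile(path).extractall("character_13")
```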
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 12 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 14 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 20 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 7 | [Download](3/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 4 | 6 | [Download](4/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 5 | 24 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 9 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 64 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 25 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 49 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 28 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 9 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 56 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 207 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 39 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 40 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 53 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 53 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 45 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 210 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 22 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 26 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 14 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 34 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 7 | [Download](24/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 25 | 28 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 63 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 148 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 6 | [Download](28/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 29 | 67 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 19 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 18 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 16 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 8 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 88 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 33 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 8 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 215 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
| BangumiBase/reborntomasterthebladefromherokingtoextraordinarysquire | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | 2024-02-12T05:16:15+00:00 | {"license": "mit", "size_categories": ["1K<n<10K"], "tags": ["art"]} | 2024-02-12T07:02:33+00:00 | [] | [] | TAGS
#size_categories-1K<n<10K #license-mit #art #region-us
| Bangumi Image Base of Reborn To Master The Blade From Hero-king To Extraordinary Squire
=======================================================================================
This is the image base of bangumi Reborn to Master the Blade From Hero-King to Extraordinary Squire, we detected 38 characters, 1790 images in total. The full dataset is here.
Please note that these image bases are not guaranteed to be 100% cleaned, they may be noisy actual. If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| [] | [
"TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n"
] |
f14028770e72a063202dea6787ee11c082f13b57 | # Medbase_data: Multilingual Medical LLM Evaluation (EN/ZH/ES/FR/AR/HI)
<p align="center">
📃 <a href="" target="_blank">Paper</a> • 🌐 <a href="https://github.com/FreedomIntelligence/Medbase" target="_blank">Github</a> • 🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/Medbase_eval" target="_blank">HuggingFace</a>
<br>
</p>

## Usage
- [Zip File](https://huggingface.co/datasets/FreedomIntelligence/Medbase_eval/blob/main/Medbase_eval-datasets.zip)
- [Data category](https://huggingface.co/datasets/FreedomIntelligence/Medbase_eval/tree/main/test)
- [question list](https://huggingface.co/datasets/FreedomIntelligence/Medbase_eval/tree/main/questions): A collection of questions for training data leakage self-examination
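As a minimal sketch (assuming only that the archive lives at the path shown in the Zip File link above), the bundled evaluation sets can be fetched and unpacked with `huggingface_hub`:

```python
import zipfile
from huggingface_hub import hf_hub_download

# Fetch the bundled evaluation sets from this dataset repository.
path = hf_hub_download(
    repo_id="FreedomIntelligence/Medbase_eval",
    filename="Medbase_eval-datasets.zip",
    repo_type="dataset",
)

# Unpack the archive; the per-language/per-task layout follows the Data category folder.
zipfile.ZipFile(path).extractall("Medbase_eval")
```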
## Data:
- EN:
- [MedQA-USMLE](https://huggingface.co/datasets/GBaker/MedQA-USMLE-4-options)
- [MedMCQA](https://huggingface.co/datasets/medmcqa/viewer/default/test)
- [PubMedQA](https://huggingface.co/datasets/pubmed_qa)
- [MMLU-Medical](https://huggingface.co/datasets/cais/mmlu) Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine
- ZH:
- [MedQA-MCMLE](https://huggingface.co/datasets/bigbio/med_qa/viewer/med_qa_zh_4options_bigbio_qa/test)
- [CMB-single](https://huggingface.co/datasets/FreedomIntelligence/CMB) Randomly sample 2,000 multiple-choice questions with single answer.
- [CMMLU-Medical](https://huggingface.co/datasets/haonan-li/cmmlu) Anatomy, Clinical_knowledge, College_medicine, Genetics, Nutrition, Traditional_chinese_medicine, Virology
- [CExam](https://github.com/williamliujl/CMExam) Randomly sample 2,000 multiple-choice questions
- ES: [Head_qa](https://huggingface.co/datasets/head_qa)
- FR: [Frenchmedmcqa](https://github.com/qanastek/FrenchMedMCQA)
- HI: [MMLU_Medical_HI](https://huggingface.co/datasets/FreedomIntelligence/MMLU_Hindi) Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine
- AR: [MMLU_Medical_Ara](https://huggingface.co/datasets/FreedomIntelligence/MMLU_Arabic) Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine
## Citation
```
@misc{medbase,
title={MedBase: Exploring the boundaries of open source LLM medical capabilities},
author={Xidong Wang and Junyin Chen and Nuo Chen and Yidong Wang and Zhiyi Zhang and Benyou Wang},
year = {2024},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/FreedomIntelligence/Medbase}},
}
``` | FreedomIntelligence/Medbase_eval | [
"license:apache-2.0",
"region:us"
] | 2024-02-12T05:20:42+00:00 | {"license": "apache-2.0"} | 2024-02-12T11:40:35+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
| # Medbase_data: Multilingual Medical LLM Evaluation (EN/ZH/ES/FR/AR/HI)
<p align="center">
<a href="" target="_blank">Paper</a> • <a href="URL target="_blank">Github</a> • <a href="URL target="_blank">HuggingFace</a>
<br>
</p>
!Medbase
## Usage
- Zip File
- Data category
- question list: A collection of questions for training data leakage self-examination
## Data:
- EN:
- MedQA-USMLE
- MedMCQA
- PubMedQA
- MMLU-Medical Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine
- ZH:
- MedQA-MCMLE
- CMB-single Randomly sample 2,000 multiple-choice questions with single answer.
- CMMLU-Medical Anatomy, Clinical_knowledge, College_medicine, Genetics, Nutrition, Traditional_chinese_medicine, Virology
- CExam Randomly sample 2,000 multiple-choice questions
- ES: Head_qa
- FR: Frenchmedmcqa
- HI: MMLU_Medical_HI Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine
- AR: MMLU_Medical_Ara Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine
| [
"# Medbase_data: Multilingual Medical LLM Evaluation (EN/ZH/ES/FR/AR/HI)\n\n<p align=\"center\">\n <a href=\"\" target=\"_blank\">Paper</a> • <a href=\"URL target=\"_blank\">Github</a> • <a href=\"URL target=\"_blank\">HuggingFace</a> \n <br> \n</p>\n\n\n\n!Medbase",
"## Usage\n\n- Zip File\n- Data category\n- question list: A collection of questions for training data leakage self-examination",
"## Data:\n\n- EN:\n - MedQA-USMLE \n - MedMCQA\n - PubMedQA\n - MMLU-Medical Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine\n- ZH:\n - MedQA-MCMLE\n - CMB-single Randomly sample 2,000 multiple-choice questions with single answer.\n - CMMLU-Medical Anatomy, Clinical_knowledge, College_medicine, Genetics, Nutrition, Traditional_chinese_medicine, Virology\n - CExam Randomly sample 2,000 multiple-choice questions\n\n\n- ES: Head_qa\n- FR: Frenchmedmcqa\n- HI: MMLU_Medical_HI Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine\n- AR: MMLU_Medical_Ara Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"# Medbase_data: Multilingual Medical LLM Evaluation (EN/ZH/ES/FR/AR/HI)\n\n<p align=\"center\">\n <a href=\"\" target=\"_blank\">Paper</a> • <a href=\"URL target=\"_blank\">Github</a> • <a href=\"URL target=\"_blank\">HuggingFace</a> \n <br> \n</p>\n\n\n\n!Medbase",
"## Usage\n\n- Zip File\n- Data category\n- question list: A collection of questions for training data leakage self-examination",
"## Data:\n\n- EN:\n - MedQA-USMLE \n - MedMCQA\n - PubMedQA\n - MMLU-Medical Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine\n- ZH:\n - MedQA-MCMLE\n - CMB-single Randomly sample 2,000 multiple-choice questions with single answer.\n - CMMLU-Medical Anatomy, Clinical_knowledge, College_medicine, Genetics, Nutrition, Traditional_chinese_medicine, Virology\n - CExam Randomly sample 2,000 multiple-choice questions\n\n\n- ES: Head_qa\n- FR: Frenchmedmcqa\n- HI: MMLU_Medical_HI Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine\n- AR: MMLU_Medical_Ara Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine"
] |
c813647882722900ddf1d924252ba504c9685840 |
# Data Card for Augmented Australian Legal QA Dataset with Embeddings
## Dataset Description
This dataset serves as a precursor to the [Open Australian Legal QA dataset](https://huggingface.co/datasets/umarbutler/open-australian-legal-qa) and the [Open Australian Legal QA Paraphrased Questions](https://huggingface.co/datasets/Ramitha/open-australian-legal-qa-paraphrased-questions) dataset, enriched with both retrieval and non-retrieval embeddings. It was generated to facilitate advanced NLP applications in legal question answering and information retrieval, leveraging Angle Embeddings for creating semantically rich vector representations.
## Dataset Structure and Column Types
The main dataset csv consists of 10,620 entries with the following columns:
- `question`: Paraphrased legal questions (string).
- `answer`: Answers to the questions (string).
- `text`: Additional contextual information or details (string).
- `prompt`: The prompt used for generating or paraphrasing the content (string).
- `source`: Metadata about the source document, including citation and jurisdiction (string).
- Embedding columns, each containing a 1D list with 4096 dimensions:
- `question_non_retrieval_embeddings`
- `answer_retrieval_embeddings`
- `answer_non_retrieval_embeddings`
- `question_retrieval_embeddings`
- `text_non_retrieval_embeddings`
- `text_retrieval_embeddings`
These embeddings are designed to support a range of NLP tasks, from content retrieval to deep semantic analysis.
## Loading the Main Dataset CSV
```python
import pandas as pd
# Load the dataset
df = pd.read_csv('main.csv')
print(df.head())
```
This code snippet demonstrates loading the dataframe from a CSV file.
## Loading Embedding Assets and Mapping to DataFrame
```python
import json
from tqdm import tqdm
# Load the JSON files into memory
json_files = [
'question_nr_embeddings.json',
'answer_r_embeddings.json',
'answer_nr_embeddings.json',
'question_r_embeddings.json',
'passage_nr_embeddings.json',
'passage_r_embeddings.json'
]
all_embeddings = {}
for file_name in tqdm(json_files):
    with open(f'assets/{file_name}', 'r') as file:
        all_embeddings[file_name] = json.load(file)

# Define the mapping from embedding file names to DataFrame columns
file_to_column_mapping = {
    'question_nr_embeddings.json': ('question', 'non_retrieval'),
    'answer_r_embeddings.json': ('answer', 'retrieval'),
    'answer_nr_embeddings.json': ('answer', 'non_retrieval'),
    'question_r_embeddings.json': ('question', 'retrieval'),
    'passage_nr_embeddings.json': ('text', 'non_retrieval'),
    'passage_r_embeddings.json': ('text', 'retrieval'),
}

for file_name, (column_name, retrieval_type) in tqdm(file_to_column_mapping.items()):
    # Reuse the embeddings loaded above instead of re-reading the file from disk
    file_embeddings = all_embeddings[file_name]

    # Name the new column after the source column and the retrieval type
    new_column_name = f"{column_name}_{retrieval_type}_embeddings"

    # Look up each row's text; entries missing from the JSON become None
    df[new_column_name] = df[column_name].apply(lambda x: file_embeddings.get(x, None))
```
This process demonstrates how to load embedding vectors stored as JSON files and map them to the appropriate columns in your dataframe, based on the content's nature and the type of retrieval they are intended for, making later manipulation and analysis easier.
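Once the embedding columns are in place, they can be used for a simple similarity search. The snippet below is only an illustration (using the first row's question as the query, plain cosine similarity, and a -1 fill value for missing embeddings are arbitrary choices, not part of the dataset):

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity between two embedding lists; small epsilon avoids division by zero.
    a = np.asarray(a, dtype=np.float32)
    b = np.asarray(b, dtype=np.float32)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

# Use the first question's retrieval embedding as an example query
# (assumes that row's question was present in the embedding JSON).
query_vec = df.loc[0, "question_retrieval_embeddings"]

# Score every passage by cosine similarity; rows without an embedding get -1.
scores = df["text_retrieval_embeddings"].apply(
    lambda v: cosine(query_vec, v) if v is not None else -1.0
)

# Show the best-matching passage alongside the original question.
print(df.loc[scores.idxmax(), ["question", "text"]])
```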
| imperialwarrior/australia-legal-qa-dataset-and-embeddings | [
"region:us"
] | 2024-02-12T05:22:08+00:00 | {} | 2024-02-12T06:16:04+00:00 | [] | [] | TAGS
#region-us
|
# Data Card for Augmented Australian Legal QA Dataset with Embeddings
## Dataset Description
This dataset serves as a precursor to the Open Australian Legal QA dataset and Open Australian Legal QA Paraphrased Questions datasets, enriched with both retrieval and non-retrieval embeddings. It was generated to facilitate advanced NLP applications in legal question answering and information retrieval, leveraging Angle Embeddings for creating semantically rich vector representations.
## Dataset Structure and Column Types
The main dataset csv consists of 10,620 entries with the following columns:
- 'question': Paraphrased legal questions (string).
- 'answer': Answers to the questions (string).
- 'text': Additional contextual information or details (string).
- 'prompt': The prompt used for generating or paraphrasing the content (string).
- 'source': Metadata about the source document, including citation and jurisdiction (string).
- Embedding columns, each containing a 1D list with 4096 dimensions:
- 'question_non_retrieval_embeddings'
- 'answer_retrieval_embeddings'
- 'answer_non_retrieval_embeddings'
- 'question_retrieval_embeddings'
- 'text_non_retrieval_embeddings'
- 'text_retrieval_embeddings'
These embeddings are designed to support a range of NLP tasks, from content retrieval to deep semantic analysis.
## Loading the Main Dataset CSV
This code snippet demonstrates loading the dataframe from a CSV file.
## Loading Embedding Assets and Mapping to DataFrame
This process demonstrates how to load embedding vectors stored as JSON files and map them to the appropriate columns in your dataframe, based on the content's nature and the type of retrieval they are intended for, enabling easier manipulation and analysis.
| [
"# Data Card for Augmented Australian Legal QA Dataset with Embeddings",
"## Dataset Description\n\nThis dataset serves as a precursor to the Open Australian Legal QA dataset and Open Australian Legal QA Paraphrased Questions dataset , enriched with both retrieval and non-retrieval embeddings. It was generated to facilitate advanced NLP applications in legal question answering and information retrieval, leveraging Angle Embeddings for creating semantically rich vector representations.",
"## Dataset Structure and Column Types\n\nThe main dataset csv consists of 10,620 entries with the following columns:\n\n- 'question': Paraphrased legal questions (string).\n- 'answer': Answers to the questions (string).\n- 'text': Additional contextual information or details (string).\n- 'prompt': The prompt used for generating or paraphrasing the content (string).\n- 'source': Metadata about the source document, including citation and jurisdiction (string).\n- Embedding columns, each containing a 1D list with 4096 dimensions:\n - 'question_non_retrieval_embeddings'\n - 'answer_retrieval_embeddings'\n - 'answer_non_retrieval_embeddings'\n - 'question_retrieval_embeddings'\n - 'text_non_retrieval_embeddings'\n - 'text_retrieval_embeddings'\n\nThese embeddings are designed to support a range of NLP tasks, from content retrieval to deep semantic analysis.",
"## Loading the Main Dataset CSV\n\n\n\n\nThis code snippet demonstrates loading the dataframe from a CSV file.",
"## Loading Embedding Assets and Mapping to DataFrame\n\n\nThis process demonstrates how to load embedding vectors stored as JSON files and map them to the appropriate columns in your dataframe based on the content's nature and the type of retrieval they are intended URL manipulation and analysis."
] | [
"TAGS\n#region-us \n",
"# Data Card for Augmented Australian Legal QA Dataset with Embeddings",
"## Dataset Description\n\nThis dataset serves as a precursor to the Open Australian Legal QA dataset and Open Australian Legal QA Paraphrased Questions dataset , enriched with both retrieval and non-retrieval embeddings. It was generated to facilitate advanced NLP applications in legal question answering and information retrieval, leveraging Angle Embeddings for creating semantically rich vector representations.",
"## Dataset Structure and Column Types\n\nThe main dataset csv consists of 10,620 entries with the following columns:\n\n- 'question': Paraphrased legal questions (string).\n- 'answer': Answers to the questions (string).\n- 'text': Additional contextual information or details (string).\n- 'prompt': The prompt used for generating or paraphrasing the content (string).\n- 'source': Metadata about the source document, including citation and jurisdiction (string).\n- Embedding columns, each containing a 1D list with 4096 dimensions:\n - 'question_non_retrieval_embeddings'\n - 'answer_retrieval_embeddings'\n - 'answer_non_retrieval_embeddings'\n - 'question_retrieval_embeddings'\n - 'text_non_retrieval_embeddings'\n - 'text_retrieval_embeddings'\n\nThese embeddings are designed to support a range of NLP tasks, from content retrieval to deep semantic analysis.",
"## Loading the Main Dataset CSV\n\n\n\n\nThis code snippet demonstrates loading the dataframe from a CSV file.",
"## Loading Embedding Assets and Mapping to DataFrame\n\n\nThis process demonstrates how to load embedding vectors stored as JSON files and map them to the appropriate columns in your dataframe based on the content's nature and the type of retrieval they are intended URL manipulation and analysis."
] |
0f0892b91d35c920285de36e9c9d849fe02e70ef |
# Overview
Original from the sentences-transformers library.
For research purposes only.
Adapter by Aisuko
# Installation
```python
!pip install sentence-transformers==2.3.1
```
# Computing Embeddings for a large set of sentences
```python
import os
import csv
import time
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import http_get
if __name__=='__main__':
url='http://qim.fs.quoracdn.net/quora_duplicate_questions.tsv'
dataset_path='quora_duplicate_questions.tsv'
# max_corpus_size=50000 # max number of sentences to deal with
if not os.path.exists(dataset_path):
http_get(url, dataset_path)
# get all unique sentences from the file
corpus_sentences=set()
with open(dataset_path, encoding='utf8') as fIn:
reader=csv.DictReader(fIn, delimiter='\t', quoting=csv.QUOTE_MINIMAL)
for row in reader:
corpus_sentences.add(row['question1'])
corpus_sentences.add(row['question2'])
# if len(corpus_sentences)>=max_corpus_size:
# break
corpus_sentences=list(corpus_sentences)
model=SentenceTransformer('all-MiniLM-L6-v2').to('cuda')
model.max_seq_length=256
pool=model.start_multi_process_pool()
# computing the embeddings using the multi-process pool
emb=model.encode_multi_process(corpus_sentences, pool,batch_size=128,chunk_size=1024,normalize_embeddings=True)
print('Embeddings computed. Shape:', emb.shape)
# optional : stop the processes in the pool
model.stop_multi_process_pool(pool)
```
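# Query the corpus with a new question (example)
This step is not part of the original script; it is a small illustrative follow-up showing how the computed corpus embeddings can be searched with `sentence_transformers.util.semantic_search`. The query string is an arbitrary example.
```python
from sentence_transformers import util

# Encode a new query with the same model and normalization as the corpus
query_emb = model.encode(['How can I start learning machine learning?'], normalize_embeddings=True)

# Cosine-similarity search over the precomputed corpus embeddings
hits = util.semantic_search(query_emb, emb, top_k=5)[0]
for hit in hits:
    print(round(hit['score'], 3), corpus_sentences[hit['corpus_id']])
```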
# Save the csv file
```python
import pandas as pd
corpus_embedding=pd.DataFrame(emb)
corpus_embedding.to_csv('quora_questions.csv',index=False)
``` | aisuko/quora_questions | [
"license:apache-2.0",
"region:us"
] | 2024-02-12T05:33:03+00:00 | {"license": "apache-2.0"} | 2024-02-12T05:46:45+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
# Overview
Original from the sentences-transformers library.
For research purposes only.
Adapter by Aisuko
# Installation
# Computing Embeddings for a large set of sentences
# Save the csv file
| [
"# Overview\n\nOriginal from the sentences-transformers library.\n\nOnly for researching purposes.\n\nAdapter by Aisuko",
"# Installation",
"# Computing Embeddings for a large set of sentences",
"# Save the csv file"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"# Overview\n\nOriginal from the sentences-transformers library.\n\nOnly for researching purposes.\n\nAdapter by Aisuko",
"# Installation",
"# Computing Embeddings for a large set of sentences",
"# Save the csv file"
] |
2dbfac6d45849c8148168b983c75f08e1ff1f1c3 |
# Dataset Card for CGL-Dataset-v2
[](https://github.com/shunk031/huggingface-datasets_CGL-Dataset-v2/actions/workflows/ci.yaml)
[](https://github.com/shunk031/huggingface-datasets_CGL-Dataset-v2/actions/workflows/push_to_hub.yaml)
## Table of Contents
- [Dataset Card Creation Guide](#dataset-card-creation-guide)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/liuan0803/RADM
- **Repository:** https://github.com/shunk031/huggingface-datasets_CGL-Dataset-v2
- **Paper (Preprint):** https://arxiv.org/abs/2306.09086
- **Paper (CIKM'23):** https://dl.acm.org/doi/10.1145/3583780.3615028
### Dataset Summary
CGL-Dataset V2 is a dataset for the task of automatic graphic layout design of advertising posters, containing 60,548 training samples and 1035 testing samples. It is an extension of CGL-Dataset.
### Supported Tasks and Leaderboards
[More Information Needed]
<!-- For each of the tasks tagged for this dataset, give a brief description of the tag, metrics, and suggested models (with a link to their HuggingFace implementation if available). Give a similar description of tasks that were not covered by the structured tag set (repace the `task-category-tag` with an appropriate `other:other-task-name`).
- `task-category-tag`: The dataset can be used to train a model for [TASK NAME], which consists in [TASK DESCRIPTION]. Success on this task is typically measured by achieving a *high/low* [metric name](https://huggingface.co/metrics/metric_name). The ([model name](https://huggingface.co/model_name) or [model class](https://huggingface.co/transformers/model_doc/model_class.html)) model currently achieves the following score. *[IF A LEADERBOARD IS AVAILABLE]:* This task has an active leaderboard which can be found at [leaderboard url]() and ranks models based on [metric name](https://huggingface.co/metrics/metric_name) while also reporting [other metric name](https://huggingface.co/metrics/other_metric_name). -->
### Languages
The language data in CGL-Dataset v2 is in Chinese ([BCP-47 zh](https://www.rfc-editor.org/info/bcp47)).
## Dataset Structure
### Data Instances
To use CGL-Dataset v2 dataset, you need to download `RADM_dataset.tar.gz` that includes the poster image, text and text features via [JD Cloud](https://3.cn/10-dQKDKG) or [Google Drive](https://drive.google.com/file/d/1ezOzR7MX3MFFIfWgJmmEaqXn3iDFp2si/view?usp=sharing).
Then place the downloaded files in the following structure and specify its path.
```shell
/path/to/datasets
└── RADM_dataset.tar.gz
```
```python
import datasets as ds
dataset = ds.load_dataset(
path="shunk031/CGL-Dataset-v2",
data_dir="/path/to/datasets/RADM_dataset.tar.gz",
decode_rle=True, # True if Run-length Encoding (RLE) is to be decoded and converted to binary mask.
include_text_features=True, # True if RoBERTa-based text feature is to be loaded.
)
```
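Because the data fields are not yet documented in this card, the snippet below is only an illustrative way to inspect what the loader returns (it assumes a `train` split is exposed):
```python
# Inspect the available splits and the feature names of the first training sample
print(dataset)
sample = dataset["train"][0]
print(list(sample.keys()))
```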
### Data Fields
[More Information Needed]
<!-- List and describe the fields present in the dataset. Mention their data type, and whether they are used as input or output in any of the tasks the dataset currently supports. If the data has span indices, describe their attributes, such as whether they are at the character level or word level, whether they are contiguous or not, etc. If the datasets contains example IDs, state whether they have an inherent meaning, such as a mapping to other datasets or pointing to relationships between data points.
- `example_field`: description of `example_field`
Note that the descriptions can be initialized with the **Show Markdown Data Fields** output of the [Datasets Tagging app](https://huggingface.co/spaces/huggingface/datasets-tagging), you will then only need to refine the generated descriptions. -->
### Data Splits
[More Information Needed]
<!-- Describe and name the splits in the dataset if there are more than one.
Describe any criteria for splitting the data, if used. If there are differences between the splits (e.g. if the training annotations are machine-generated and the dev and test ones are created by humans, or if different numbers of annotators contributed to each example), describe them here.
Provide the sizes of each split. As appropriate, provide any descriptive statistics for the features, such as average length. For example:
| | train | validation | test |
|-------------------------|------:|-----------:|-----:|
| Input Sentences | | | |
| Average Sentence Length | | | | -->
## Dataset Creation
### Curation Rationale
[More Information Needed]
<!-- What need motivated the creation of this dataset? What are some of the reasons underlying the major choices involved in putting it together? -->
### Source Data
[More Information Needed]
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences,...) -->
#### Initial Data Collection and Normalization
[More Information Needed]
<!-- Describe the data collection process. Describe any criteria for data selection or filtering. List any key words or search terms used. If possible, include runtime information for the collection process.
If data was collected from other pre-existing datasets, link to source here and to their [Hugging Face version](https://huggingface.co/datasets/dataset_name).
If the data was modified or normalized after being collected (e.g. if the data is word-tokenized), describe the process and the tools used. -->
#### Who are the source language producers?
[More Information Needed]
<!-- State whether the data was produced by humans or machine generated. Describe the people or systems who originally created the data.
If available, include self-reported demographic or identity information for the source data creators, but avoid inferring this information. Instead state that this information is unknown. See [Larson 2017](https://www.aclweb.org/anthology/W17-1601.pdf) for using identity categories as a variables, particularly gender.
Describe the conditions under which the data was created (for example, if the producers were crowdworkers, state what platform was used, or if the data was found, what website the data was found on). If compensation was provided, include that information here.
Describe other people represented or mentioned in the data. Where possible, link to references for the information. -->
### Annotations
[More Information Needed]
<!-- If the dataset contains annotations which are not part of the initial data collection, describe them in the following paragraphs. -->
#### Annotation process
[More Information Needed]
<!-- If applicable, describe the annotation process and any tools used, or state otherwise. Describe the amount of data annotated, if not all. Describe or reference annotation guidelines provided to the annotators. If available, provide interannotator statistics. Describe any annotation validation processes. -->
#### Who are the annotators?
[More Information Needed]
<!-- If annotations were collected for the source data (such as class labels or syntactic parses), state whether the annotations were produced by humans or machine generated.
Describe the people or systems who originally created the annotations and their selection criteria if applicable.
If available, include self-reported demographic or identity information for the annotators, but avoid inferring this information. Instead state that this information is unknown. See [Larson 2017](https://www.aclweb.org/anthology/W17-1601.pdf) for using identity categories as a variables, particularly gender.
Describe the conditions under which the data was annotated (for example, if the annotators were crowdworkers, state what platform was used, or if the data was found, what website the data was found on). If compensation was provided, include that information here. -->
### Personal and Sensitive Information
[More Information Needed]
<!-- State whether the dataset uses identity categories and, if so, how the information is used. Describe where this information comes from (i.e. self-reporting, collecting from profiles, inferring, etc.). See [Larson 2017](https://www.aclweb.org/anthology/W17-1601.pdf) for using identity categories as a variables, particularly gender. State whether the data is linked to individuals and whether those individuals can be identified in the dataset, either directly or indirectly (i.e., in combination with other data).
State whether the dataset contains other data that might be considered sensitive (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history).
If efforts were made to anonymize the data, describe the anonymization process. -->
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
<!-- Please discuss some of the ways you believe the use of this dataset will impact society.
The statement should include both positive outlooks, such as outlining how technologies developed through its use may improve people's lives, and discuss the accompanying risks. These risks may range from making important decisions more opaque to people who are affected by the technology, to reinforcing existing harmful biases (whose specifics should be discussed in the next section), among other considerations.
Also describe in this section if the proposed dataset contains a low-resource or under-represented language. If this is the case or if this task has any impact on underserved communities, please elaborate here. -->
### Discussion of Biases
[More Information Needed]
<!-- Provide descriptions of specific biases that are likely to be reflected in the data, and state whether any steps were taken to reduce their impact.
For Wikipedia text, see for example [Dinan et al 2020 on biases in Wikipedia (esp. Table 1)](https://arxiv.org/abs/2005.00614), or [Blodgett et al 2020](https://www.aclweb.org/anthology/2020.acl-main.485/) for a more general discussion of the topic.
If analyses have been run quantifying these biases, please add brief summaries and links to the studies here. -->
### Other Known Limitations
[More Information Needed]
<!-- If studies of the datasets have outlined other limitations of the dataset, such as annotation artifacts, please outline and cite them here. -->
## Additional Information
### Dataset Curators
[More Information Needed]
<!-- List the people involved in collecting the dataset and their affiliation(s). If funding information is known, include it here. -->
### Licensing Information
[More Information Needed]
<!-- Provide the license and link to the license webpage if available. -->
### Citation Information
<!-- Provide the [BibTex](http://www.bibtex.org/)-formatted reference for the dataset. For example:
```
@article{article_id,
author = {Author List},
title = {Dataset Paper Title},
journal = {Publication Venue},
year = {2525}
}
```
If the dataset has a [DOI](https://www.doi.org/), please provide it here. -->
```bibtex
@inproceedings{li2023relation,
title={Relation-Aware Diffusion Model for Controllable Poster Layout Generation},
author={Li, Fengheng and Liu, An and Feng, Wei and Zhu, Honghe and Li, Yaoyu and Zhang, Zheng and Lv, Jingjing and Zhu, Xin and Shen, Junjie and Lin, Zhangang},
booktitle={Proceedings of the 32nd ACM international conference on information & knowledge management},
pages={1249--1258},
year={2023}
}
```
### Contributions
Thanks to [@liuan0803](https://github.com/liuan0803) for creating this dataset.
| pytorch-layout-generation/CGL-Dataset-v2 | [
"task_categories:other",
"annotations_creators:crowdsourced",
"language_creators:found",
"multilinguality:monolingual",
"source_datasets:CGL-Dataset",
"language:zh",
"license:unknown",
"graphic design",
"arxiv:2306.09086",
"arxiv:2005.00614",
"region:us"
] | 2024-02-12T06:04:48+00:00 | {"annotations_creators": ["crowdsourced"], "language_creators": ["found"], "language": ["zh"], "license": ["unknown"], "multilinguality": ["monolingual"], "size_categories": [], "source_datasets": ["CGL-Dataset"], "task_categories": ["other"], "task_ids": [], "pretty_name": "CGL-Dataset v2", "tags": ["graphic design"]} | 2024-02-12T09:53:10+00:00 | [
"2306.09086",
"2005.00614"
] | [
"zh"
] | TAGS
#task_categories-other #annotations_creators-crowdsourced #language_creators-found #multilinguality-monolingual #source_datasets-CGL-Dataset #language-Chinese #license-unknown #graphic design #arxiv-2306.09086 #arxiv-2005.00614 #region-us
|
# Dataset Card for CGL-Dataset-v2
: URL
- Paper (CIKM'23): URL
### Dataset Summary
CGL-Dataset V2 is a dataset for the task of automatic graphic layout design of advertising posters, containing 60,548 training samples and 1035 testing samples. It is an extension of CGL-Dataset.
### Supported Tasks and Leaderboards
### Languages
The language data in CGL-Dataset v2 is in Chinese (BCP-47 zh).
## Dataset Structure
### Data Instances
To use CGL-Dataset v2 dataset, you need to download 'RADM_dataset.URL' that includes the poster image, text and text features via JD Cloud or Google Drive.
Then place the downloaded files in the following structure and specify its path.
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
Thanks to @liuan0803 for creating this dataset.
| [
"# Dataset Card for CGL-Dataset-v2\n\n: URL\n- Paper (CIKM'23): URL",
"### Dataset Summary\n\nCGL-Dataset V2 is a dataset for the task of automatic graphic layout design of advertising posters, containing 60,548 training samples and 1035 testing samples. It is an extension of CGL-Dataset.",
"### Supported Tasks and Leaderboards",
"### Languages\n\nThe language data in CGL-Dataset v2 is in Chinese (BCP-47 zh).",
"## Dataset Structure",
"### Data Instances\n\nTo use CGL-Dataset v2 dataset, you need to download 'RADM_dataset.URL' that includes the poster image, text and text features via JD Cloud or Google Drive.\nThen place the downloaded files in the following structure and specify its path.",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions\n\nThanks to @liuan0803 for creating this dataset."
] | [
"TAGS\n#task_categories-other #annotations_creators-crowdsourced #language_creators-found #multilinguality-monolingual #source_datasets-CGL-Dataset #language-Chinese #license-unknown #graphic design #arxiv-2306.09086 #arxiv-2005.00614 #region-us \n",
"# Dataset Card for CGL-Dataset-v2\n\n: URL\n- Paper (CIKM'23): URL",
"### Dataset Summary\n\nCGL-Dataset V2 is a dataset for the task of automatic graphic layout design of advertising posters, containing 60,548 training samples and 1035 testing samples. It is an extension of CGL-Dataset.",
"### Supported Tasks and Leaderboards",
"### Languages\n\nThe language data in CGL-Dataset v2 is in Chinese (BCP-47 zh).",
"## Dataset Structure",
"### Data Instances\n\nTo use CGL-Dataset v2 dataset, you need to download 'RADM_dataset.URL' that includes the poster image, text and text features via JD Cloud or Google Drive.\nThen place the downloaded files in the following structure and specify its path.",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions\n\nThanks to @liuan0803 for creating this dataset."
] |
482abb5c8ebd8eb1348483ce0959e063ce27059e | original dataset: https://agi-labs.github.io/FuncCorr/
prompt:
```
Humans can find corresponding points for the same action between different objects.
For instance, if a person uses a pot versus a hammer to "Mash Pound", then the handle of the pot will be the
corresponding point to the handle of the hammer because they serve the same function for the action -- to hold.
Given the following two images, set the red circle on the left image as the reference point.
You are given multiple red-circled points on the right image, choices of "A, B, C, D" are drawn beside each circle.
Now select between them and find the corresponding point for the reference point, if we use both items for the action: "{action}".
Select between the choices on the right image.
```
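A minimal sketch of how an evaluation loop might fill in the prompt per example; the repository id and the `idx`/`image`/`action` fields are taken from the dataset metadata below, while the variable names and the abridged template are illustrative:
```python
from datasets import load_dataset

# An abridged version of the instruction text shown above, keeping the {action} placeholder
prompt_template = 'Find the corresponding point for the reference point, if we use both items for the action: "{action}".'

ds = load_dataset("PerceptionEval/FunctionalCorrespondenceTest", split="test")
example = ds[0]
prompt = prompt_template.format(action=example["action"])
print(example["idx"], prompt)
example["image"].save("pair_0.png")  # side-by-side image with the circled points and letters
```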
---
license: apache-2.0
dataset_info:
features:
- name: idx
dtype: int32
- name: image
dtype: image
- name: action
dtype: string
splits:
- name: test
num_bytes: 51229948.0
num_examples: 400
download_size: 51239471
dataset_size: 51229948.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
| PerceptionEval/FunctionalCorrespondenceTest | [
"region:us"
] | 2024-02-12T06:05:49+00:00 | {} | 2024-02-12T16:10:45+00:00 | [] | [] | TAGS
#region-us
| original dataset: URL
prompt:
---
license: apache-2.0
dataset_info:
features:
- name: idx
dtype: int32
- name: image
dtype: image
- name: action
dtype: string
splits:
- name: test
num_bytes: 51229948.0
num_examples: 400
download_size: 51239471
dataset_size: 51229948.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
| [] | [
"TAGS\n#region-us \n"
] |
f6a678e69d8fc43005aa30843a81c27771e22943 | # Medbase_data: Multilingual Medical LLM Dataset (EN/ZH/ES/FR/AR/HI)
<p align="center">
📃 <a href="" target="_blank">Paper</a> • 🌐 <a href="https://github.com/FreedomIntelligence/Medbase" target="_blank">Github</a> • 🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/Medbase_data" target="_blank">HuggingFace</a>
<br>
</p>

## Usage
### Data download
- [Zip File](https://huggingface.co/datasets/FreedomIntelligence/Medbase_data/blob/main/medbase_data.zip)
- [Data category](https://huggingface.co/datasets/FreedomIntelligence/Medbase_data/tree/main/train)
- Pretrain:
<details><summary>Click to expand</summary>
- json_name: {data_source}\_{language}\_{data_type}.json
- data_source: medicalBook, medicalGuideline, medicalPaper, medicalWeb(from online forum), medicalWiki
- language: en(English), zh(chinese), es(spanish), fr(french), hi(Hindi)
- data_type: qa(generated qa from text)
- data item:
- data_type==text: list of string
```
[
"string1",
"string2",
...
]
```
- data_type==qa: list of qa pairs(list of string)
```
[
[
"q1",
"a1",
"q2",
"a2",
...
],
...
]
```
</details>
- SFT:
<details><summary>Click to expand</summary>
- json_name: {data_source}\_{language}(\_clean/dup).json
- data_type: code, general, math, medicalExam, medicalPatient
- Data leakage exclusion
- clean: Data set after removing problems that appear in [Medbase_eval](https://huggingface.co/datasets/FreedomIntelligence/Medbase_eval)
- dup: Data pairs that appear in Medbase_eval
- data item: list of qa pairs(list of string)
```
[
[
"q1",
"a1",
"q2",
"a2",
...
],
...
]
```
</details>
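As a small illustrative example (the file name below is hypothetical — substitute any of the SFT json files described above), the qa-pair format can be flattened into (question, answer) tuples like this:
```python
import json

# Hypothetical file name; use any {data_source}_{language}(_clean).json from the SFT folder
with open("medicalExam_en_clean.json", "r", encoding="utf-8") as f:
    records = json.load(f)  # list of qa pairs, each a flat list [q1, a1, q2, a2, ...]

pairs = []
for turns in records:
    pairs.extend(zip(turns[0::2], turns[1::2]))  # pair up alternating questions and answers

print(len(pairs), pairs[0])
```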
### Data leakage exclusion
[Data leakage exclusion](https://huggingface.co/datasets/FreedomIntelligence/Medbase_data/tree/main/check_exam) : Remove data that appears simultaneously in [Medbase_eval](https://huggingface.co/datasets/FreedomIntelligence/Medbase_eval) and [Mebase_data/medicalExam](https://huggingface.co/datasets/FreedomIntelligence/Medbase_data/tree/main/train/sft)
Because evaluation sets differ from user to user, **the data in the zip file has not been checked for data leakage.**
**You must conduct a data leakage check before use.**
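A minimal sketch of such a check is given below; it assumes `records` holds qa-pair lists loaded as in the previous snippet and that you supply the set of question strings from your own evaluation set (for example, Medbase_eval):
```python
def drop_leaked(records, eval_questions):
    """Keep only qa-pair records whose questions do not appear in the eval set (exact match)."""
    eval_questions = set(eval_questions)
    return [
        turns for turns in records
        if not any(q in eval_questions for q in turns[0::2])
    ]

# Example usage (eval_questions must be provided by you):
# clean_records = drop_leaked(records, eval_questions)
```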
## **Data:** Huge, Diverse, Clean, Multilingual
| Data Type | Description | Source(ZH) | Source(EN) | Source(FR) | Source(ES) | Source(AR) | Source(HI) |
| ------------------ | ---------------------------- | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | -------------------------------------------------------- | ------------------------------------------------------------ | ------------------------------------------------------------ |
| Continue Pretrain | | | | | | | |
| Medical Books | Medical related Books | MedQA-books | Pile-Books | - | | | |
| Medical Guidelines | Clinical Medicine Guide | - | [Medtron guideline](https://huggingface.co/datasets/epfl-llm/guidelines) | - | | | |
| Medical Wiki | Medical related wikipedia | - | Wikipedia & Wikidoc | [CLEAR - Simple Corpus for Medical French](http://natalia.grabar.free.fr/resources.php#clear) | - | - | [Hindi_health](https://www.kaggle.com/datasets/aijain/hindi-health-dataset/data?select=Symptom+Gazetteer.txt) |
| Medical Paper | Medical related paper | Papers abstract | PubMed Abstract | [MORFITT](https://huggingface.co/datasets/qanastek/MORFITT?row=98): Pubmed-french Cochrane: [CLEAR-](http://natalia.grabar.free.fr/resources.php#clear)abs | [Mesinesp](https://zenodo.org/records/3826492) | - | - |
| Medical Web | Medical related web data | Wudao | C4 | [Frenchmedmcqa](https://github.com/qanastek/FrenchMedMCQA)_train | [CoWeSe](https://zenodo.org/records/5513237) | - | - |
| SFT | | | | | | | |
| Medical Exam | Medical related exams | MedQA CExam CMB (Train Set) | MedQA MedmcQA PubMedQA (Train Set) | - | [Head_qa](https://huggingface.co/datasets/head_qa)_train | - | - |
| Medical Patient | Doctor-patient dialogue data | [HuatuoGPT-I](https://huggingface.co/datasets/FreedomIntelligence/HuatuoGPT-sft-data-v1) | [PMC_patients](https://huggingface.co/datasets/zhengyun21/PMC-Patients?row=34) | - | - | [MAQA](https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/Y2JBEZ) | - |
| General_Replay | General SFT Data | Wizard & ShareGPT & Alpaca | Wizard & ShareGPT & Alpaca & [Dataset List](https://huggingface.co/jondurbin/bagel-dpo-34b-v0.2#sft-data-sources) | ShareGPT & Alpaca | ShareGPT & Alpaca | ShareGPT & Alpaca | ShareGPT & Alpaca |
| Code | Code Data | [leetcode-11k](https://huggingface.co/datasets/krisfu/awesome-llm-datasets-only-Chinese) | [python_alpaca](https://huggingface.co/datasets/Vezora/Tested-22k-Python-Alpaca) | - | - | - | - |
| Math | Math Data | | [mathinstruct](https://huggingface.co/datasets/TIGER-Lab/MathInstruct) | - | - | - | - |
## Citation
```
@misc{medbase,
title={MedBase, Exploring the boundaries of open source LLM medical capabilities},
author={Xidong Wang, Junyin Chen, Nuo Chen, Yidong Wang, Zhiyi Zhang, Benyou Wang},
year = {2024},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/FreedomIntelligence/Medbase}},
}
``` | FreedomIntelligence/Medbase_data | [
"license:apache-2.0",
"region:us"
] | 2024-02-12T06:06:05+00:00 | {"license": "apache-2.0"} | 2024-02-12T11:40:04+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
| Medbase\_data: Multilingual Medical LLM Dataset (EN/ZH/ES/FR/AR/HI)
===================================================================
Paper • [Github](URL target=) • [HuggingFace](URL target=)
!Medbase
Usage
-----
### Data download
* Zip File
* Data category
+ Pretrain:
Click to expand
- json\_name: {data\_source}\_{language}\_{data\_type}.json
* data\_source: medicalBook, medicalGuideline, medicalPaper, medicalWeb(from online forum), medicalWiki
* language: en(English), zh(chinese), es(spanish), fr(french), hi(Hindi)
* data\_type: qa(generated qa from text)
- data item:
* data\_type==text: list of string
* data\_type==qa: list of qa pairs(list of string)
+ SFT:
Click to expand
- json\_name: {data\_source}\_{language}(\_clean/dup).json
* data\_type: code, general, math, medicalExam, medicalPatient
* Data leakage exclusion
+ clean: Data set after removing problems that appear in Medbase\_eval
+ dup: Data pairs that appear in Medbase\_eval
- data item: list of qa pairs(list of string)
### Data leakage exclusion
Data leakage exclusion : Remove data that appears simultaneously in Medbase\_eval and Mebase\_data/medicalExam
Because evaluation sets differ from user to user, the data in the zip file has not been checked for data leakage.
You must conduct a data leakage check before use.
Data: Huge, Diverse, Clean, Multilingual
----------------------------------------
| [
"### Data download\n\n\n* Zip File\n* Data category\n\t+ Pretrain:\n\t\n\t\n\tClick to expand\n\t\n\t\t- json\\_name: {data\\_source}\\_{language}\\_{data\\_type}.json\n\t\t\t* data\\_source: medicalBook, medicalGuideline, medicalPaper, medicalWeb(from online forum), medicalWiki\n\t\t\t* language: en(English), zh(chinese), es(spanish), fr(french), hi(Hindi)\n\t\t\t* data\\_type: qa(generated qa from text)\n\t\t- data item:\n\t\t\t* data\\_type==text: list of string\n\t\t\t* data\\_type==qa: list of qa pairs(list of string)\n\t+ SFT:\n\t\n\t\n\tClick to expand\n\t\n\t\t- json\\_name: {data\\_source}\\_{language}(\\_clean/dup).json\n\t\t\t* data\\_type: code, general, math, medicalExam, medicalPatient\n\t\t\t* Data leakage exclusion\n\t\t\t\t+ clean: Data set after removing problems that appear in Medbase\\_eval\n\t\t\t\t+ dup: Data pairs that appear in Medbase\\_eval\n\t\t- data item: list of qa pairs(list of string)",
"### Data leakage exclusion\n\n\nData leakage exclusion : Remove data that appears simultaneously in Medbase\\_eval and Mebase\\_data/medicalExam\n\n\nTaking into account the differences in everyone's evaluation sets, the data in the zip file is an unchecked version for data leakage.\n\n\nYou must conduct a data leakage check before use.\n\n\nData: Huge, Diverse, Clean, Multilingual\n----------------------------------------"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"### Data download\n\n\n* Zip File\n* Data category\n\t+ Pretrain:\n\t\n\t\n\tClick to expand\n\t\n\t\t- json\\_name: {data\\_source}\\_{language}\\_{data\\_type}.json\n\t\t\t* data\\_source: medicalBook, medicalGuideline, medicalPaper, medicalWeb(from online forum), medicalWiki\n\t\t\t* language: en(English), zh(chinese), es(spanish), fr(french), hi(Hindi)\n\t\t\t* data\\_type: qa(generated qa from text)\n\t\t- data item:\n\t\t\t* data\\_type==text: list of string\n\t\t\t* data\\_type==qa: list of qa pairs(list of string)\n\t+ SFT:\n\t\n\t\n\tClick to expand\n\t\n\t\t- json\\_name: {data\\_source}\\_{language}(\\_clean/dup).json\n\t\t\t* data\\_type: code, general, math, medicalExam, medicalPatient\n\t\t\t* Data leakage exclusion\n\t\t\t\t+ clean: Data set after removing problems that appear in Medbase\\_eval\n\t\t\t\t+ dup: Data pairs that appear in Medbase\\_eval\n\t\t- data item: list of qa pairs(list of string)",
"### Data leakage exclusion\n\n\nData leakage exclusion : Remove data that appears simultaneously in Medbase\\_eval and Mebase\\_data/medicalExam\n\n\nTaking into account the differences in everyone's evaluation sets, the data in the zip file is an unchecked version for data leakage.\n\n\nYou must conduct a data leakage check before use.\n\n\nData: Huge, Diverse, Clean, Multilingual\n----------------------------------------"
] |
d24feadbc51558b848ea47e721009829be82452c | # Dataset Card for "live_ATC_inf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | adityarra07/live_ATC_inf | [
"region:us"
] | 2024-02-12T06:10:37+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}], "splits": [{"name": "train", "num_bytes": 126968658.0, "num_examples": 148}], "download_size": 87553168, "dataset_size": 126968658.0}} | 2024-02-12T06:10:41+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "live_ATC_inf"
More Information needed | [
"# Dataset Card for \"live_ATC_inf\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"live_ATC_inf\"\n\nMore Information needed"
] |
7189638ce0a6e5db447a9912fc460853db55940b | # Dataset Card for "live_ATC_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | adityarra07/live_ATC_DAL | [
"region:us"
] | 2024-02-12T06:15:20+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}], "splits": [{"name": "train", "num_bytes": 30502481.0, "num_examples": 28}], "download_size": 26150276, "dataset_size": 30502481.0}} | 2024-02-12T06:15:22+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "live_ATC_test"
More Information needed | [
"# Dataset Card for \"live_ATC_test\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"live_ATC_test\"\n\nMore Information needed"
] |
679ada0cd83475cdfd1a1b96b204a393471e9c51 | Aditya757864/DATA | [
"task_categories:translation",
"language:en",
"license:mit",
"code",
"region:us"
] | 2024-02-12T06:19:24+00:00 | {"language": ["en"], "license": "mit", "task_categories": ["translation"], "tags": ["code"], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/*.csv"}, {"split": "test", "path": "holdout/*.csv"}]}]} | 2024-02-12T06:22:21+00:00 | [] | [
"en"
] | TAGS
#task_categories-translation #language-English #license-mit #code #region-us
| [] | [
"TAGS\n#task_categories-translation #language-English #license-mit #code #region-us \n"
] |
||
4ce50351cb99bf07ffc8266d9139d16a2d031385 |
# Bangumi Image Base of So, I Can't Play H!
This is the image base of bangumi So, I Can't Play H!, we detected 21 characters, 1739 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned, they may be noisy actual.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 295 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 137 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 23 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 161 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 8 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 12 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 284 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 138 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 36 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 11 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 8 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 8 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 226 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 13 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 110 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 33 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 16 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 10 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 6 | [Download](18/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 19 | 38 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 166 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
| BangumiBase/soicantplayh | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | 2024-02-12T06:38:31+00:00 | {"license": "mit", "size_categories": ["1K<n<10K"], "tags": ["art"]} | 2024-02-12T07:47:26+00:00 | [] | [] | TAGS
#size_categories-1K<n<10K #license-mit #art #region-us
| Bangumi Image Base of So, I Can't Play H!
=========================================
This is the image base of bangumi So, I Can't Play H!, we detected 21 characters, 1739 images in total. The full dataset is here.
Please note that these image bases are not guaranteed to be 100% clean; they may still contain noise. If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| [] | [
"TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n"
] |
56de5f8c3df1545ffdbe8997bb8610d9d26ebc06 |
For research purposes only.
Adapter: Aisuko
For more detail, see https://www.kaggle.com/code/aisuko/distribution-compute-of-quora-questions-embeddings | aisuko/quora_questions_raw | [
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-02-12T06:43:33+00:00 | {"language": ["en"], "license": "apache-2.0"} | 2024-02-12T06:45:50+00:00 | [] | [
"en"
] | TAGS
#language-English #license-apache-2.0 #region-us
|
For research purposes only.
Adapter: Aisuko
More detail see URL | [] | [
"TAGS\n#language-English #license-apache-2.0 #region-us \n"
] |
b4e7dd1cbffd851c91b550e9cfa39f2b3d86a758 | Credit to Suikamelon | Chat-Error/short-stories | [
"task_categories:conversational",
"language:en",
"not-for-all-audiences",
"region:us"
] | 2024-02-12T06:47:36+00:00 | {"language": ["en"], "task_categories": ["conversational"], "tags": ["not-for-all-audiences"]} | 2024-02-12T15:46:31+00:00 | [] | [
"en"
] | TAGS
#task_categories-conversational #language-English #not-for-all-audiences #region-us
| Credit to Suikamelon | [] | [
"TAGS\n#task_categories-conversational #language-English #not-for-all-audiences #region-us \n"
] |
3912ad1cd47d7de9b605693b232112c8bd5f01b5 | Original dataset: https://cvlab.postech.ac.kr/research/SPair-71k/
prompt:
```
Humans can find corresponding points for different objects in the same category.
For instance, if there are images of two different cats, then the left ear tip of one cat corresponds to the
left ear tip of the other cat, and the right front paw of one cat corresponds to the right front paw of the other cat.
Given the following two images, set the red circle on the left image as the reference point.
You are given multiple red-circled points on the right image, choices of "A, B, C, D" are drawn beside each circle.
Now select between them and find the corresponding point for the reference point.
Select between the choices on the right image.
```
---
license: apache-2.0
dataset_info:
features:
- name: idx
dtype: int32
- name: image
dtype: image
splits:
- name: test
num_bytes: 291364105.0
num_examples: 441
download_size: 291395001
dataset_size: 291364105.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
| PerceptionEval/SemanticCorrespondenceTest | [
"region:us"
] | 2024-02-12T07:16:38+00:00 | {} | 2024-02-12T16:11:43+00:00 | [] | [] | TAGS
#region-us
| Original dataset: URL
prompt:
---
license: apache-2.0
dataset_info:
features:
- name: idx
dtype: int32
- name: image
dtype: image
splits:
- name: test
num_bytes: 291364105.0
num_examples: 441
download_size: 291395001
dataset_size: 291364105.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
| [] | [
"TAGS\n#region-us \n"
] |
30c08755da2ed1a96832b78de3af79e8ab954041 |
# Dataset Card for Evaluation run of liminerity/binarized-ingotrix-slerp-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/binarized-ingotrix-slerp-7b](https://huggingface.co/liminerity/binarized-ingotrix-slerp-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__binarized-ingotrix-slerp-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-12T08:08:55.496275](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__binarized-ingotrix-slerp-7b/blob/main/results_2024-02-12T08-08-55.496275.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.654568791820223,
"acc_stderr": 0.032014779599218245,
"acc_norm": 0.6539344231237445,
"acc_norm_stderr": 0.03268102047204416,
"mc1": 0.6132190942472461,
"mc1_stderr": 0.017048857010515103,
"mc2": 0.7556810534513498,
"mc2_stderr": 0.014243358378169969
},
"harness|arc:challenge|25": {
"acc": 0.7107508532423208,
"acc_stderr": 0.013250012579393441,
"acc_norm": 0.7320819112627986,
"acc_norm_stderr": 0.012942030195136445
},
"harness|hellaswag|10": {
"acc": 0.7143995220075682,
"acc_stderr": 0.004507768029590103,
"acc_norm": 0.8863772156940849,
"acc_norm_stderr": 0.0031670398072286853
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055273,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055273
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834841,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834841
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.01275197796767601,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.01275197796767601
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6132190942472461,
"mc1_stderr": 0.017048857010515103,
"mc2": 0.7556810534513498,
"mc2_stderr": 0.014243358378169969
},
"harness|winogrande|5": {
"acc": 0.8287292817679558,
"acc_stderr": 0.010588417294962524
},
"harness|gsm8k|5": {
"acc": 0.711144806671721,
"acc_stderr": 0.012484219800126668
}
}
```
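The same aggregated numbers shown above are also stored in the `results` configuration of this dataset. A minimal sketch for reading them with the `datasets` library (the repository id, configuration, and split names follow the conventions stated earlier in this card):

```python
from datasets import load_dataset

# Aggregated metrics for this run live in the "results" configuration;
# the "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_liminerity__binarized-ingotrix-slerp-7b",
    "results",
    split="latest",
)

print(results)
```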
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
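Pending a fuller description, the layout follows the conventions already stated in this card: one configuration per evaluated task, plus an aggregated `results` configuration, each exposing a split per run timestamp and a `latest` split. A minimal sketch for enumerating them, assuming the standard `datasets` utilities are available:

```python
from datasets import get_dataset_config_names, get_dataset_split_names

REPO = "open-llm-leaderboard/details_liminerity__binarized-ingotrix-slerp-7b"

# List every configuration (one per evaluated task, plus "results").
configs = get_dataset_config_names(REPO)
print(len(configs), "configurations, e.g.", configs[:3])

# Each configuration has one split per run timestamp and a "latest" alias.
print(get_dataset_split_names(REPO, configs[0]))
```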
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-02-12T08:11:21+00:00 | {"pretty_name": "Evaluation run of liminerity/binarized-ingotrix-slerp-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [liminerity/binarized-ingotrix-slerp-7b](https://huggingface.co/liminerity/binarized-ingotrix-slerp-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__binarized-ingotrix-slerp-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T08:08:55.496275](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__binarized-ingotrix-slerp-7b/blob/main/results_2024-02-12T08-08-55.496275.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.654568791820223,\n \"acc_stderr\": 0.032014779599218245,\n \"acc_norm\": 0.6539344231237445,\n \"acc_norm_stderr\": 0.03268102047204416,\n \"mc1\": 0.6132190942472461,\n \"mc1_stderr\": 0.017048857010515103,\n \"mc2\": 0.7556810534513498,\n \"mc2_stderr\": 0.014243358378169969\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7107508532423208,\n \"acc_stderr\": 0.013250012579393441,\n \"acc_norm\": 0.7320819112627986,\n \"acc_norm_stderr\": 0.012942030195136445\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7143995220075682,\n \"acc_stderr\": 0.004507768029590103,\n \"acc_norm\": 0.8863772156940849,\n \"acc_norm_stderr\": 0.0031670398072286853\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055273,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055273\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n 
\"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.822477650063857,\n \"acc_stderr\": 0.013664230995834841,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834841\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n \"acc_stderr\": 0.01275197796767601,\n \"acc_norm\": 0.47327249022164275,\n \"acc_norm_stderr\": 0.01275197796767601\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6132190942472461,\n \"mc1_stderr\": 0.017048857010515103,\n \"mc2\": 0.7556810534513498,\n \"mc2_stderr\": 0.014243358378169969\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8287292817679558,\n \"acc_stderr\": 0.010588417294962524\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.711144806671721,\n \"acc_stderr\": 0.012484219800126668\n }\n}\n```", "repo_url": 
"https://huggingface.co/liminerity/binarized-ingotrix-slerp-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|arc:challenge|25_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|gsm8k|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hellaswag|10_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T08-08-55.496275.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T08-08-55.496275.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T08-08-55.496275.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T08-08-55.496275.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T08-08-55.496275.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T08_08_55.496275", "path": ["**/details_harness|winogrande|5_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T08-08-55.496275.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_12T08_08_55.496275", "path": ["results_2024-02-12T08-08-55.496275.parquet"]}, {"split": "latest", "path": ["results_2024-02-12T08-08-55.496275.parquet"]}]}]} | 2024-02-12T08:11:42+00:00 | [] | [] | TAGS